By Aobo Dong
Would you be willing to accept a professional care-giving robot as a replacement for a human companion when your loved ones are far away? During last week’s HLS Health Law Workshop, Professor Belinda Bennett provided a great overview of the imminent age of machines and automation and the legal and ethical challenges the new era entails, especially in health care law and bioethics. After discussing three areas of potential health law complications, Professor Bennett argued that the field of health law is undergoing a transition from the “bio” to the “digital” or “auto,” and that instead of playing catch-up with rapidly evolving technologies, more focus should be placed on learning from past and existing laws and regulations in order to meet new demands from the “second machine age.” However, I wish to propose a closely related but alternative paradigm: using the issues raised by new technologies as a vehicle for improving existing laws and reshaping the social norms that once made those laws inadequate or flawed. I will elaborate on this point through the author’s own example of elderly care.
Although the author advocates a revisionist approach to thinking about health law and technology, her paradigm is still one of laws serving the needs and addressing the concerns of the technology industry as it intersects with health care. I wonder whether it would be productive to view the issue from the opposite direction: how could new technologies and the challenges they raise inform us about existing laws, revealing blind spots or providing opportunities to improve unjust, unfair, or discriminatory laws? Viewed this way, we could not only strengthen connections between past laws and future technologies, but also be guided by a clearer sense of how future legal reforms and regulations could redress past neglect and meet new challenges.
For example, Bennett points out that changing global demographics have led to increasingly challenging problems in elderly care, especially for those with dementia. She identifies social isolation as a major challenge, and sees technologies potentially meeting it by “enabling greater contact with family members, carers and the community” (p. 7). She also notes that technologies like robotic companions could become “a form of deception” for certain patients and could outsource care and social interactions, once performed by real humans, to robots. However, these ethical problems were not created by these technologies; they were already present in the elderly care industry before smartphones were invented. Family members already “outsourced” or relegated the task of social interaction with their own parents to professional staff at elderly care facilities. Some even dropped off parents suffering from dementia at nursing homes without their full consent, or deceived them with false promises of regular visits. Since laws regulating filial piety are inconceivable in Western societies, social norms and community values (discussed briefly by the author) dictated this reality of elderly care. Under these preexisting conditions, technologies that further relegate human tasks to machines would remove the human element and connection from dementia patients’ lives even more.
So how can the law redress past wrongs with the aid of opportunities created by new technologies? The author acknowledges community values as a regulatory modality that can render laws outdated. Nonetheless, laws can also help reshape and inform community values. A capability approach to human rights would see human connection as a fundamental capability, even for dementia patients. New technologies capable of virtually bringing relatives thousands of miles away into the patient’s room could enhance the patient’s capability for human interaction and meaningful connection. Laws could encourage the use of such technologies by relatives with legitimate reasons preventing them from visiting their parents in person, while guarding against the trap of outsourcing interaction to virtual mediums for those neglecting their own relatives. China, for example, has enacted an “Elderly Rights Law” requiring children to visit their parents, a move that can be interpreted as “legislating filial piety.” But with the help of technology, visits no longer need to happen physically. The spread of realistic virtual reality technologies and AI-managed chatbot systems modeled on the conversational habits of a real person could open up whole new possibilities as well as ethical dilemmas.
Moreover, “big data” monitoring of dementia patients’ conditions and care-giving quality could help increase accountability in care-giving facilities. More and more nursing homes are using binding pre-dispute arbitration agreements to mitigate legal liability in case relatives accuse them of neglect that led to a patient’s death. If family members were given tools to track the condition of their loved ones from afar, such pools of data could be used in court or arbitration to substantiate their case. Consequently, the behavior of nursing staff could be reshaped and evaluated against higher ethical and professional standards. Issues of patient privacy necessarily follow. But the sheer availability of data monitoring could add to the negotiating power of patients and family members, making them less likely to be pressured or coerced into these unfair arbitration agreements.