The 21st Century Cures Act was passed with support from both sides of the aisle (imagine that!) and signed into law by then-President Obama late last year. This ambitious legislation drives action in areas as diverse as drug and device regulation and response to the opioid epidemic. It also tackles the issue of how to make data more broadly available for research use and clinical purposes. In our recently published GIM article, “Sharing data under the 21st Century Cures Act,” we examine the Act’s potential to facilitate data-sharing, in line with a recent position statement of the American College of Medical Genetics and Genomics. We highlight a number of provisions of the Act that either explicitly advance data-sharing or promote policy developments that have the potential to advance it. For example, Section 2014 of the Act authorizes the Director of the National Institutes of Health to require award recipients to share data, and Section 4006 requires the Secretary of Health and Human Services to promote policies ensuring that patients have access to their electronic health information and are supported in sharing this information with others.
Just as relevant, the Act takes steps to reduce some major barriers to data sharing. An important feature of the Act, which has not been widely publicized, is its incorporation of provisions from legislation originally proposed by Senators Elizabeth Warren and Mike Enzi to protect the identifiable, sensitive information of research subjects. Senator Warren, in particular, has been a vocal advocate of data sharing. Arguably, one of the biggest barriers to sharing is public concern about privacy. The relevant provisions address this concern chiefly via Certificates of Confidentiality. Among other things, the Act makes issuance of Certificates automatic for federally funded research in which identifiable, sensitive information is collected, and it prohibits disclosure of identifiable, sensitive information by covered researchers, with only a few exceptions such as disclosure for purposes of other research. These protections became effective June 11, 2017. While NIH has signaled its awareness of the Act, it has not yet updated its Certificates of Confidentiality webpage.
A few weeks ago I ran across this BuzzFeed post telling the story of Corey Mason, a 14-year-old male-to-female Trans teenager who was filmed getting her first pack of estrogen hormones. Her mom Erica, who uploaded the video to Facebook and YouTube, spurred a social-media discussion about hormonal treatment for Trans children and youth.
Erica said the vast majority of reactions were very supportive. On the other hand, different views and opinions were put on the table as well, even from people who ally completely with Trans identity politics. One of them, a Trans woman, said she fears rushing (perhaps gay) teenagers into irreversible treatments, as most Trans kids “GROW OUT OF IT”. This position was also taken by Alice Dreger, a bioethicist and historian writing on intersex issues, in describing the uneasy choice between the two models available at the moment. On the one hand, you have the ‘therapeutic model’, which offers mental health support to the Trans person and/or family to help ease the tensions caused by gender dysphoria; this model aims to relax the dysphoria and so avoids any irreversible medical interventions. On the other hand, you have the ‘accommodation model’, which asserts there is nothing wrong with the Trans person and/or his/her family, and so offers medical interventions to accommodate the person’s gender identity.
As the nation braces for possibly more Ebola cases, civil liberties, including patient privacy, should be considered. As news media feature headline-grabbing stories about quarantines, let’s think about the laws governing privacy in healthcare. Despite federal laws enacted to protect patient privacy, the Ebola scare brings into sharp relief the vulnerability of individuals and the limits of the regulations intended to protect them.
In 1996, Congress enacted the Health Insurance Portability and Accountability Act (HIPAA) to protect patient privacy. Specifically, HIPAA’s Privacy Rule requires that healthcare providers and their business associates restrict access to patients’ health care information. For many years, the law has been regarded as the strongest federal statement regarding patient privacy. But it may be tested in the wake of the Ebola scare with patients’ names, photographs, and even family information entering the public sphere.
Ebola hysteria raises questions not only about how to contain the disease, but also about the extent to which Americans value their healthcare privacy. What liberties are Americans willing to sacrifice to calm their fears? How should we balance the concern for public welfare with legal and ethical privacy principles? For example, will Americans tolerate profiling travelers based on their race or national origin as a precautionary measure? What type of reporting norms should govern Ebola cases? Should reporting the existence of an Ebola case also include disclosing the name of the patient? I don’t think so, but the jury appears to be out for many.
On September 9 Apple is hosting its ‘Wish We Could Say More’ event. In the interim we will be deluged with usually uninformed speculation about the new iPhone, an iWatch wearable, and who knows what else. What we do know, because Apple announced it back in June, is that iOS 8, Apple’s mobile operating system, will include an app called ‘Health’ (backed by a ‘HealthKit’ API) that will aggregate health and fitness data from the iPhone’s own internal sensors, third-party wearables, and EMRs.
What has been less clear is how the privacy of this data is to be protected. There is some low-hanging legal fruit. For example, when Apple partners with the Mayo Clinic or EMR manufacturers to make EMR data available from covered entities, it is squarely within the HIPAA Privacy and Security Rules, triggering the requirements for Business Associate Agreements, etc.
But what of the health data being collected by the Apple health data aggregator or other apps that lies outside of protected HIPAA space? Fitness and health data picked up by apps and stored on the phone or on an app developer’s analytic cloud fails the HIPAA applicability test, yet may be as sensitive as anything stored on a hospital server (as I have argued elsewhere). HIPAA may not apply but this is not a completely unregulated area. The FTC is more aggressively policing the health data space and is paying particular attention to deviance from stated privacy policies by app developers. The FTC also enforces a narrow and oft-forgotten part of HIPAA that applies a breach notification rule to non-covered entity PHR vendors, some of whom no doubt will be selling their wares on the app store.
Anonymity is not just an aspect of privacy, and recognizing their difference reveals a powerful and poorly understood set of legal tools for facilitating and controlling the production of public goods. This is the central claim of my newest article (SSRN draft available here).
Three examples illustrate the scope of the under-explored ways in which anonymity is currently used in our law.
The first is from June 1997, when many residents of the Boston neighborhood of Allston learned to their anger that Harvard University had spent the previous eight years secretly acquiring over 50 acres of Allston real estate. It did so using buying agents, who can generally protect their principal’s anonymity—even by falsely stating that they are not agents.
The second is from Election Day 2012, when many voters who had shared photos of their completed ballots on Facebook and Twitter learned, to their surprise, that they had violated their states’ elections laws in doing so. They did not know that anonymity in voting was not just a right, but also a requirement.
The third is from a 2006 lawsuit over the control of thousands of tissue samples being used in research at Washington University. When many of the research participants sought to withdraw their tissue from future research, in response to what they saw as a breach of their consent, they were shocked to learn that the university could refuse and extinguish their rights of withdrawal by anonymizing their tissue samples.
These varied uses of anonymity in our law—as a right when purchasing land, a requirement in voting, and a trigger that extinguishes rights in biomedical research—may appear to be unrelated. But I argue that they are in fact all part of a cohesive and previously unrecognized class of rules that use anonymity not to protect privacy, but rather to incentivize or control the production and circulation of information and other socially desirable goods.
According to an article in the NYT, an artist has collected DNA samples from litter on sidewalks, such as chewing gum and cigarette butts, and used those samples to extract and sequence DNA that she then used to make computer models of their owners’ faces. She then printed 3-D masks that she is showing at her upcoming exhibit called Stranger Visions. The artist hopes her exhibit will spark a dialogue over genetic surveillance.
[w]hile staring at the wall of her therapist’s office, the artist Heather Dewey-Hagborg noticed a strand of hair stuck in a hanging print. Walking home, she noticed that the subways and sidewalks were littered with genetic material on things like chewing gum and cigarette butts, some still moist with saliva. Curious about what she could learn, Ms. Dewey-Hagborg began to extract and sequence DNA from these discarded materials. Then — and here it gets a little eerie — she began to make computer models of their owners’ faces, using genetic clues to print 3-D masks that she concedes “might look more like a possible cousin than a spitting image.” Hanging these portraits along with the original samples, she says, is “a provocation designed to spur a cultural dialogue about genetic surveillance.” After the June exhibitions, Ms. Dewey-Hagborg will show her work early next year at the New York Public Library. She has also collaborated on a tongue-in-cheek project called DNA spoofing, which purports to offer ordinary people some techniques to avoid detection by scrambling their genetic material.
Talk and exhibition at Genspace in Brooklyn on June 13. Exhibition at QF Gallery in East Hampton, N.Y., opens June 29.
I posted in June about the fact that my social security number (and possibly other personal information) had been downloaded to an unknown site in Eastern Europe as part of a large security breach from the Utah state health department. In connection with that breach, I have filed a complaint with the Office for Civil Rights at HHS (OCR).
I thought readers might like to know, however, that the process of complaining about a HIPAA violation to OCR is cumbersome indeed. There are forms available online, here. You can open them and fill in information, but you can’t save them. If you close the form, you lose all the data. You also can’t file them online–you have to print them out and fax them off. (You are helpfully told, however, to “print out a copy for your records.”) I finally figured out that if you save the form to Notepad before you fill it out, you can then email it to HHS–but this required a telephone call to the appropriate regional office of HHS.
When I pointed out to OCR that this process is not exactly user-friendly, they indicated that they are “working on it.” Imagine someone without a home computer, fax machine, or printer using public library computers in the effort to reach OCR about what they regard as a significant problem with their health information. Surely in a world of blue buttons and digital Medicare strategies (see Responsive Design and the New Medicare.gov), the ability to file a complaint about possible violations of health information security or confidentiality should be an easier online process.
Establishing the infrastructure needed for the efficient, accurate, and secure exchange of health information is a crucial piece of improving care in the US. Exchange fosters the ready availability of information, reducing redundancy and hopefully improving care quality. To this end, proposals for a National Health Information Network were highly touted during the Bush Administration and continue to be supported by the Obama Administration; the Office of the National Coordinator for Health Information Technology (ONC) was established in 2004; and two federal advisory committees (the ONC Policy Committee and the ONC Standards Committee) were established by Congress in the HITECH Act in 2009. Yet progress toward health information exchange remains halting at best–some hypothesize because of resistance within the private sector itself. Recent developments at ONC are not encouraging.