In the early days of living wills, the 1970s and 1980s, a major objective was to avoid being maintained on burdensome medical machinery in a highly debilitated state at the end stage of a fatal affliction. The contemporaneous legislation endorsing advance directives was typically geared to “terminal illness” (meaning likely death within six months). The distasteful specter was a moribund patient tethered to burdensome interventions like a respirator or a dialysis machine despite an unavoidable, looming demise. A common short-form living will rejected life support that “only prolongs the dying process” for a patient in “a terminal condition.”[i]
Another specter was being medically sustained in an utterly dismal quality of life, such as permanent unconsciousness without awareness of or interaction with one’s environment. The contemporaneous legislation explicitly authorized advance directives seeking to avoid medical maintenance in a permanently vegetative state. And several landmark cases authorizing surrogate end-of-life determinations involved permanently unconscious patients. See Quinlan (N.J. 1976); Brophy (Mass. 1986); Browning (Fla. 1990); Schiavo (Fla. 2005).
With the increasing prevalence of Alzheimer’s disease and similar degenerative dementias, the focus of advance directives has changed for some people. The primary specter is neither an unavoidable, looming demise nor the insensate limbo of permanent unconsciousness. Rather, the emerging concern is protracted maintenance during progressively increasing cognitive dysfunction and helplessness. For some, being mired in a demented state is an intolerably degrading prospect well before the advanced stage when the person no longer recognizes loved ones and is totally uncomprehending.
For people like me, who see even moderate dementia as an intolerably demeaning condition that stains one’s life image, an advance directive may seek to facilitate death by declining even simple medical interventions like antibiotics. The hope is that death will soon ensue when an infection is left untreated or when artificial nutrition and hydration are withheld in the face of an eating disorder.
I am obsessed with avoiding severe dementia. As a person who has always valued intellectual function, I find the prospect of lingering in a dysfunctional cognitive state distasteful, indeed an intolerable indignity. For me, such mental debilitation would soil the remembrances left with my survivors and undermine the life narrative that I have assiduously cultivated as a vibrant, thinking, and articulate figure. (Burdening others is also a distasteful prospect, but it is the vision of intolerable indignity that drives my planning of how to respond to a diagnosis of progressive dementia such as Alzheimer’s.)
An alternative strategy would be to allow myself to decline into incompetency, but beforehand to dictate, in an advance directive, the rejection of future life-sustaining medical interventions. This strategy would probably work as applied to serious maladies such as kidney disease, lethal cancer, or congestive heart failure. The disturbing issue then becomes timing. The onset of such serious maladies is a matter of chance, and years of lingering in dementia might precede my demise.
A further alternative would be to seek to accelerate my post-competence demise by declining not only major medical interventions, such as mechanical respirators or dialysis, but also simpler measures like antibiotics, antiarrhythmics, and artificial nutrition and hydration. My envisioned scenario is that an infection (of the urinary tract or skin, or pneumonia) would occur early and that this condition, left untreated, would precipitate my death. (My advance instructions would allow palliative but not curative measures.)
This past Sunday, a group of researchers reported in the journal Nature Medicine a preliminary technique that uses variation in blood levels of 10 fats to predict the likelihood that elderly individuals will develop mild cognitive impairment (MCI) or Alzheimer’s disease in the following two to three years. The sample size was small, and the results may not generalize beyond the narrow age range and demographics of the study group (i.e., the assay is far from ready for “prime time”), but the study is an important first step toward a lower-cost (vs. PET imaging) and less invasive (vs. a spinal tap) predictive biomarker of cognitive decline*. Its publication has also triggered a flurry of discussion about the possible ethical ramifications of this sort of blood biomarker. I will not attempt to address these ethical issues specifically here. Rather, I seek to highlight that how ethically troubling one views the technology to be may depend partly on the sort of knowledge one thinks these biomarkers reveal (applied epistemology at its best).