Thanks to neuroscience, some truly daunting manipulations of the human brain are under way.
Deep brain stimulation is being explored to treat resistant depression. Functional magnetic resonance imaging (fMRI) and brain fingerprinting are being developed to detect lying—and are touted by some as superior to polygraphs, whose evidence is banned in most legal jurisdictions. The brains of some patients with quadriplegia have been connected to computers to help them write. A vaccine to prevent cocaine addiction is in phase 3 clinical trials.
So reported speakers at APA's annual meeting last month at a session titled "Neuroethics: Ethical Challenges at the Cutting Edge of Neuroscience."
These various techniques are also at different stages of development, the speakers indicated.
For example, the evidence that deep brain stimulation can counter intractable depression is very preliminary and short term, Laura Dunn, M.D., an associate professor of psychiatry at the University of California, San Francisco, said.
Meanwhile, fMRI neuroimaging is a way to examine the neural activity of the brain and the flow of blood throughout the brain's vascular system simultaneously. The use of fMRI for lie detection is rooted in the observation that lying is an active process that engages areas of the brain not involved when a person responds truthfully, Paul Appelbaum, M.D., explained. Appelbaum, a former APA president, is the Elizabeth K. Dollard Professor of Psychiatry, Medicine, and Law and director of the Division of Psychiatry, Law, and Ethics at Columbia University. Yet data demonstrating the technique's effectiveness are "less than terrific so far," he said. There is also the question of whether the technique can detect intermediate veracity states such as omissions, rather than outright lying, he pointed out.
As for brain fingerprinting, people's brain waves supposedly spike when they recognize a familiar scene or piece of information, but not when they have never previously seen it, Appelbaum explained. But there are no peer-reviewed data demonstrating that the technique works, he said—at least not yet.
Henry Greely, J.D., a professor of law at Stanford University, concurred with Appelbaum's comments: these techniques “are not ready for prime time yet.”
Yet if evidence underscoring their effectiveness builds, Appelbaum noted, the results of these techniques might eventually become admissible in various courts. In fact, brain-fingerprinting results for one case have already been admitted, he said—in an Iowa court. The defendant's lawyer used the results to help make the case that the defendant was telling the truth. One can imagine that the plaintiffs' lawyers may be able to use the technique as well, Appelbaum noted.
Not surprisingly, use of such techniques is raising all sorts of ethical questions, the speakers agreed.
For example, even though deep brain stimulation has been used to treat Parkinson's patients, Dunn worries about adverse side effects that may result when it is used to treat depression. “We don't know all the risks,” she stressed. “We also want to know whether people who consent to such research do so because they are desperate.”
And even if fMRI neuroimaging and brain fingerprinting are found to be highly effective at detecting lying, should they be required in all cases involving truthfulness? Appelbaum wonders. He raised the same question at the annual meeting of the American Association of Directors of Psychiatric Residency Training this spring (Psychiatric News, April 18).
And as Greely pointed out, if a vaccine to prevent cocaine addiction is approved by the Food and Drug Administration (FDA), should all young people then be obliged to get it? (The situation here somewhat parallels recent FDA approval of a vaccine for cervical cancer, which raised the question of whether all girls should be required to have the vaccine.)
Still more human-brain developments—say, using brain scans to determine whether people feel pain, to predict Alzheimer's disease, and to identify which adolescents might develop schizophrenia—are looming large, Greely reported. These developments will raise still more ethical questions, he stressed.
Indeed, “neuroethics” is a relatively new term and a relatively new field, Appelbaum attested. ▪