It's All in Your Head: Neurotechnological Lie Detection and the Fourth and Fifth Amendments
Holley, Benjamin, Developments in Mental Health Law
"Historically, fundamental decisions regarding the implications of new technologies have occurred very early in the life cycles of those technologies.... These technologies ... evolved considerably since the courts originally addressed them, however, the mere existence of these opinions have tended to foreclose fresh analyses." (1)
A young woman in India was recently charged with murder. (2) Although every murder case is unique, the facts of this one were not particularly sordid or memorable. The prosecution alleged that the defendant, conspiring with her current fiance, poisoned her former fiance with arsenic. (3) During the criminal investigation, the defendant took--and flunked--two lie detector tests.
The first test employed a polygraph. Although this technology is often used in criminal investigations, it is not particularly reliable, (4) and its results are generally not permitted in guilt determinations in the United States. (5) The second test, however, catapulted this otherwise generic case into international headlines. (6) The test analyzed the defendant's "brain waves," and purported to prove that she possessed actual knowledge of the crime. Relying heavily on these two sets of results, the judge convicted the defendant and sentenced her to life in prison. (7)
Similar brain-based lie detectors are in development in the United States, with multiple private companies marketing their use as lie detectors (8) and some courts already considering the propriety of admitting their results as evidence. (9) Such advancements bring what was once science fiction into the courtroom and raise several challenging questions. For example, is recording brain waves a search under the Fourth Amendment? Does the Fifth Amendment's self-incrimination clause protect defendants from compelled lie detection? Courts will face these questions soon; how they initially resolve the issues will significantly affect the subsequent use of this technology. (10) Exploring these issues and anticipating the appropriate answers now is thus of prime importance.
To answer these questions, this Article proceeds in five parts. Part I examines various types of Neurotechnological Lie Detection (NTLD) (11) that might be used in criminal investigations in the near future, paying particular attention to court decisions evaluating the technology and the differences that may be relevant to future courts considering their use. Part II examines the types of situations in which constitutional questions may arise, leading to Part III, which considers how the Fourth Amendment applies to NTLD, and Part IV, which analyzes the Fifth Amendment questions NTLD raises. Finally, Part V summarizes the conclusions reached and suggests directions for future research.
I. Types of Neurotechnological Lie Detection (12)
Scientists are investigating several different types of neurotechnology-based lie detectors, each of which relies on somewhat different technology and rests on different assumptions about lying, but all of which share the same goal: determining whether the subject is telling the truth. The implications attending the development of a successful device are profound, especially in the context of criminal law.
Before examining specific types of NTLD, however, it is necessary to discuss limitations applicable to all of them. First, it is important not to overstate the capability and accuracy of these devices. (13) Many of them, particularly fMRI (14) and "brain fingerprinting," (15) show great promise, but none is as yet widely accepted by the legal or medical communities and none is foolproof. Much work remains to be done before these devices should move from the laboratory to the courtroom. Scientists continue this work at a rapid pace, however, and one should not be lulled into thinking that NTLD is merely a distant fantasy. (16)
Second, and closely related to the first point, it is important to understand what NTLD is and is not. …