Believe it or not, in a few years, someone else really might read your thoughts—with or without your permission. Science, not science fiction, is inching toward that kind of technology.

More than a dozen devices intended to test an individual's truthfulness are currently in use by government agencies and the private sector. These range from the well-known polygraph to the newest and most disturbing: functional magnetic resonance imaging (fMRI), which compares an individual's brain image in response to a given question with millions of stored brain images.

At the National Research Council's request, I presented a report at a workshop on the ethical challenges of these new devices, sponsored by the Defense Intelligence Agency (DIA) and the Office of the Director of National Intelligence. Despite the lure of new technologies, these unproven and invasive devices may pose serious threats to our liberties.

Sure, this technology could yield great benefits. But it also generates risks, along with legal and ethical concerns. The medical profession struggles to strike a balance between risks and benefits when prescribing a particular therapy, drug, or device. Similarly, our society must determine the potential risks this newfangled technology poses and weigh in on how to use it.

Exposing the human brain to these new devices creates both physical risks and political ones. Mind-reading by authorities or the private sector could easily mean a loss of freedom. Many questions must therefore be resolved before the government or the marketplace adopts this technology. What kind of informed consent should be required? What kind of information would be divulged?

Whenever new technology emerges, entrepreneurs stand ready to make money, and some of those offering these technologies are peddling snake oil. Many of the new mind-reading devices lack any validation of their safety or their usefulness for the intended purpose.

Even the old technology is questionable. The polygraph, commonly known as the lie-detector test, is widely used by government and industry to test for truthfulness for a variety of purposes. Yet a 2003 National Research Council report questioned the scientific basis of the polygraph, its accuracy, and the validity of its measurements.

Many credible scientific publications have questioned the premise that these new technologies can discern whether a subject is lying. Scientists state unequivocally that social behavior is highly influenced by cultural context. Yet the results of these new tests are being used to judge an individual's motivation, veracity, or dangerousness, and actions are taken based on the perceived threat that individual poses. As these neuroscience technologies are considered for use in intelligence, counterintelligence, and forensic science, the moral and legal issues will only grow sharper. Do we really want our tax dollars paying for the arrest of villagers in foreign countries based on a 10-second test with a device known to produce high rates of false positives?

The public must be skeptical. The risk of violating the rights of subjects on whom the technology is used may prove unacceptable.

It’s heartening to see that our intelligence and defense agencies are willing to hear about the potential ethical challenges of this new technology. It’s also essential that they weigh the risks these new devices pose before deploying them to support U.S. policies at home and abroad.

Adil E. Shamoo is the editor-in-chief of the journal Accountability in Research, a senior analyst for Foreign Policy In Focus, and a professor at the University of Maryland School of Medicine.

OtherWords commentaries are free to re-publish in print and online — all it takes is a simple attribution to OtherWords.org. To get a roundup of our work each Wednesday, sign up for our free weekly newsletter here.
