The Truth Hurts

Scientists question voice-based lie detection

Truster-Pro and the Vericator may sound like devices Wile E. Coyote would order from the Acme Co., but they are real technologies for detecting lies. Unlike the traditional polygraph, which zeroes in on factors such as pulse and breathing rate, these analyzers aim to assess veracity based solely on speech.

Caption: Many police forces have turned to voice-based lie detectors, but scientists are finding that these polygraph alternatives don’t reliably tell fact from fiction. Illustration: Michael Morgenstern

Police departments shell out thousands of dollars on such devices — known collectively as voice stress analyzers — in an attempt to tune in to vocal consequences of lying. Airports are considering versions for security screening purposes, and insurance companies may employ the polygraph alternatives to detect fraud.

But beyond their crime-fighting objective, these tools have something less noble in common with their predecessor: a poor track record in actually telling truth from deception.

Scientists evaluating Truster-Pro, the Vericator and newer analyzer models repeatedly report lackluster results. Now research finds that two of the most commonly used voice stress analyzers can discern lies from truth at roughly chance levels — no better than flipping a coin.

“Quite frankly, they’re bogus. There’s no scientific basis whatsoever for them,” says John H.L. Hansen, head of the Center for Robust Speech Systems at the University of Texas at Dallas. “Law enforcement agencies — they’re spending a lot of money on these things. It just doesn’t make sense.”

A lackluster alternative

Many agencies have been seeking alternatives to the polygraph, especially following a 2003 National Research Council report that concluded that the physiological responses measured, such as increased heart rate, can identify stress but not pinpoint deception. Champions of voice stress analyzers often cite this report among other criticisms of polygraphs as a reason to switch to voice-based lie detection. The National Institute for Truth Verification — a company based in West Palm Beach, Fla., that makes a widely used device called the Computer Voice Stress Analyzer — has a page on its website dedicated to denigrating this traditional lie detector, titled “Polygraph Failures Continue to Mount.”

But the institute fails to mention the same report’s conclusions about alternatives to the polygraph, including voice analyzers. Research offers “little or no scientific basis for the use of the computer voice stress analyzer or similar voice measurement instruments as an alternative to the polygraph for the detection of deception,” the report noted.

As with the old lie detector, creators of voice analyzers usually avoid direct claims that the units detect deception, speech perception expert James Harnsberger said in April in Baltimore at a meeting of the Acoustical Society of America. Instead, the developers contend that physiological changes that occur when someone is lying trigger consistent, readable changes in voice. “There’s an assumption that there’s a direct mind-mouth link,” said Harnsberger, of the University of Florida in Gainesville.

Speech does in fact change when a person is under stress, both in frequency and in the amount of time spent on segments of words, says Hansen. But, as with the polygraph, distinguishing stress related to deception from stress related to fatigue, anxiety or fear is not so easy.
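
Those stress-related shifts can be measured with ordinary speech-analysis tools. As a rough, hypothetical sketch (not how any commercial analyzer works), the Python code below uses the open-source librosa library to estimate a speaker’s fundamental frequency and the share of time spent on voiced segments, the kinds of features Hansen describes; the file names and feature choices are assumptions for illustration only.

```python
# Illustrative only: estimate pitch and voiced-segment share from a recording.
# This is NOT the method used by any commercial voice stress analyzer.
import librosa
import numpy as np

def voice_features(path):
    # Load audio (librosa resamples to 22,050 Hz by default)
    y, sr = librosa.load(path)

    # Estimate fundamental frequency (F0) frame by frame with the pYIN tracker
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6")
    )

    # Mean pitch over voiced frames; stress tends to raise F0
    mean_f0 = np.nanmean(f0)

    # Fraction of frames that are voiced -- a crude proxy for how much time
    # the speaker spends on voiced segments of words
    voiced_ratio = float(np.mean(voiced_flag))

    return mean_f0, voiced_ratio

# Hypothetical file names. Comparing a calm and a stressed recording of the
# same speaker would show shifts in these features, but nothing here can
# distinguish the stress of lying from fatigue, anxiety or fear.
print("calm:", voice_features("calm_sample.wav"))
print("stressed:", voice_features("stressed_sample.wav"))
```

Such measurements show that stress is audible; they say nothing about whether the stress comes from deception.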

“No one has identified an acoustic signature that is unique to deception,” says Mitchell Sommers, director of the Speech and Hearing Laboratory at Washington University in St. Louis.

Two large studies, one conducted in a jail and another in a lab, suggest that the two most widely used voice stress analyzers haven’t pinpointed such a signature, either. 

One voice analyzer — Layered Voice Analysis, created by the Israel-based company Nemesysco — purports to use more than 8,000 algorithms to tune in to three states of mind: excitement, stress and cognitive dissonance (the psychological discomfort that comes with holding two conflicting views at once). A second, the Computer Voice Stress Analyzer, claims to detect inaudible changes in “microtremors” in the voice of a lying person. Versions of both systems can cost more than $10,000 with training.

Numbers speak truth

In the jailhouse study, researchers led by Kelly Damphousse of the University of Oklahoma in Norman interviewed a random sample of 319 arrestees during booking in an Oklahoma county jail. The team asked the men about recent use of drugs, including cocaine, marijuana, PCP and methamphetamine, and researchers dissected responses with both voice analyzers. After the interview, the arrestees’ urine was tested for actual drug use.

Both voice analyzers got poor marks, Damphousse and his colleagues write in a 2007 report for the Department of Justice. All told, fewer than one-sixth of the lies were detected: LVA spotted about 10 percent of lies, and CVSA caught nearly 20 percent. The devices were better at detecting truths, correctly identifying 85 to 95 percent — but that still means 5 to 15 percent of truthful answers were falsely flagged as lies.

The technologies didn’t fare much better in the lab, Harnsberger reported at the Acoustical Society meeting. As part of a team of Florida researchers, Harnsberger underwent training for both technologies. Working with company representatives, the researchers conducted a study in which subjects were videotaped telling the truth and telling lies under varying levels of stress. For a very high-stress lie, the participants were asked to make a statement that they strongly disagreed with; topics included sexual orientation and gun control. Participants were also told that the video would be shown to their peers and that they should expect an electric shock during the statement.

One technology caught lies at rates similar to chance, and the other did somewhat better, Harnsberger and colleagues reported at the meeting and in two papers in the Journal of Forensic Sciences. But both detectors also falsely labeled true statements as lies at similar rates. These false positives, which often go unreported in studies and are left out of company descriptions of the technologies, are key for evaluating merit, Harnsberger noted.

“A common mistake is to only report how many lies were successfully detected,” Harnsberger says. “You could write ‘lie’ on a piece of paper and hold it up every time someone speaks to you, and you will detect 100 percent of the lies.”
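
Harnsberger’s point is simple arithmetic: a hit rate means little without the matching false-alarm rate. The toy calculation below, using made-up counts rather than data from the studies described above, shows why a “detector” that flags every statement as a lie catches 100 percent of lies while being useless.

```python
# Toy illustration with invented numbers, not data from the studies above.
# A hit rate alone can make a worthless detector look perfect.

def rates(n_lies, n_truths, lies_flagged, truths_flagged):
    hit_rate = lies_flagged / n_lies          # lies correctly flagged
    false_alarm = truths_flagged / n_truths   # truths wrongly flagged as lies
    return hit_rate, false_alarm

# Writing "lie" on a piece of paper for every statement:
# 100% of lies caught, but 100% of truths mislabeled.
print(rates(n_lies=50, n_truths=50, lies_flagged=50, truths_flagged=50))
# -> (1.0, 1.0)

# A coin flip flags about half of everything, lies and truths alike.
print(rates(n_lies=50, n_truths=50, lies_flagged=25, truths_flagged=25))
# -> (0.5, 0.5)
```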

Amir Liberman, CEO of Nemesysco, likens the technology to a microscope; it doesn’t detect disease per se, but it’s a tool for exploration. He adds that the circumstances and the interrogator are crucial to success. Still, Liberman says explicitly that Layered Voice Analysis can do what researchers say it can’t: “LVA differentiates between stress and lies,” he says. How exactly, he can’t disclose. The National Institute for Truth Verification declined requests for an interview.

Harnsberger has repeatedly made the case to policy makers that voice analyzers don’t live up to their manufacturers’ claims. And currently only two “credibility assessment” devices have been approved for use by the Department of Defense: the good old polygraph and a next-generation version that also evaluates physiological factors.

But manufacturers of the voice stress analyzers continue to lobby for their products, says Harnsberger. Such efforts may not be in vain. In a statement, Defense Department spokesperson René White said the department “continues to conduct research on and evaluate additional potential credibility assessment tools.” And according to USAspending.gov, the National Institute for Truth Verification, maker of the Computer Voice Stress Analyzer, has received more than $1.6 million in Defense Department contracts since 2005.

Though the technologies apparently don’t tell truth from fiction, they may have merit as props. The jailhouse study followed up work that had asked arrestees about drug use, that time without a lie detector in the room. Comparing the two studies revealed that more than three times as many drug users lied when no device was present as when one was.

“They may be very useful for eliciting admissions,” Harnsberger says. “That’s not the same as detecting lies.”
