Government agencies and big business are scrambling for “the next big thing” in lie detection. And two U of U professors may have what they’re looking for.
During a long road trip from Salt Lake to Washington state, there may not be much to talk about. Sure, the scenery can be pretty spectacular, with the Columbia River Basin flashing by outside the window and the Western landscape unfurling across the horizon like new flannel sheets. But sagebrush and sculpted hillsides can only hold your attention for so long. And that’s when the shoptalk inevitably starts.
In 2003, as University of Utah educational psychology professors John Kircher and Doug Hacker sped toward Mount Rainier for a climbing trip, that’s exactly what happened. Before long, they were discussing “how to build a better mousetrap”—or, more precisely, how to build a better lie detector. Over the course of subsequent climbing trips, the two hashed out plans to devise a reliable lie detector based on eye movement. Back at the U, they enlisted the help of colleagues Anne Cook and Dan Woltz, recruited assistants and research volunteers, and by 2005, had attracted the interest of the U.S. Department of Defense and the intelligence community.
Everyone, it seems—from mega-corporations to G-men—is out to find more effective ways to nab liars, and the two University of Utah profs may just be onto something, although they’re still crunching numbers and collating data. They’re also well aware that they’ve waded into controversial waters, as lie detectors (or polygraphs) have become legal and cultural targets, with some swearing by them as a useful tool and others dismissing their efficacy outright. But polygraph supporters say that a more accurate system is certainly possible, and that it’s only a matter of time before someone comes up with one.
The quest for a verifiable, dependable method for identifying a lie and its perpetrator is an age-old one—dating at least from the Middle Ages, when suspected liars were crammed into bags and tossed into a mill pond (floaters were liars; sinkers were good, honest folk—albeit dead, honest folk). But technological advances have prodded federal agencies to heat up the search for an accurate, reliable lie detector. So the hunt for a better fib finder is on.
The Eyes Never Lie (Or Do They?)
This goes beyond the simple trope that says a liar’s eyes are inherently shifty, or the advice that you can spot a lie-in-progress if the eyes venture to the corners, as if searching for cue cards splashed with advice on how best to proceed with the fiction. Studying the faces of habitual liars can certainly make one slightly more adept at picking out half-truths and outright fabrications. From about the age of 4, humans become skilled, convincing liars, and sometimes equally skilled at thinking they can spot a distortion. Fibs are swapped back and forth with such ease and frequency that most people have become accustomed to being regularly lied to. According to research conducted at the University of Southern California, the average person gets hit with a lie every five minutes, or about 200 times a day.
But physical reactions such as establishing or failing to hold eye contact can be suppressed, faked, or the result of any number of complicating factors, from extreme shyness to neurological problems. And the most accomplished liars are adept at manipulating certain facial expressions that would otherwise give away a mediocre liar. Agencies such as the FBI or CIA need a dependable test backed by computational data to nab a suspected fabricator. In the three areas where polygraphs are most used—pre-employment testing, criminal investigation, and counterintelligence—the stakes are high.
All three of these uses have spurred researchers like Kircher and Hacker to pursue advanced lie-detection techniques. In the mid to late ’90s, Department of Energy scientist Wen Ho Lee was under suspicion for passing Trident warhead nuclear secrets to the Chinese. The federal government’s case against Lee unraveled after the FBI bungled his polygraph tests, and eventually most charges against Lee were dropped—but not before the cloud of suspicion and bad press effectively destroyed the scientist’s career. To avoid repeats of the Wen Ho Lee debacle, most government officials agreed it was high time to revisit polygraph testing and find a more effective method for detecting deception. “Congress has tasked the National Science Foundation to find out all they can about polygraphs and lie detection,” says Kircher.
A standard polygraph measures physiological responses to perceived stress, operating under the theory that lying makes a person nervous. A polygraph will track blood pressure, heart rate, respiration, and skin conductivity (in other words, sweating) on the fingertips. But the old-fashioned polygraph, the results of which are not admissible as evidence in many state courts, has had a checkered past. Critics often point to CIA agent Aldrich Ames, convicted of spying in 1994, who some say managed to successfully “trick” the polygraph multiple times. Or baseball pitcher Gaylord Perry, whose polygraph results showed he didn’t use spitballs, but who was later caught in the act. Scores of Web sites, organizations, and publications have also cropped up with supposed methods for “beating” a polygraph test. If anything, the field is ripe for innovation, a new way of finding liars and the tales they tell without relying on blood pressure, heart rate, respiration, and skin conductivity as indicators.
Kircher and Hacker’s idea for an eye-tracking system provides a glimpse into a person’s mind, offering a snapshot of the deceptive process at work by measuring a liar’s thought processes as closely as possible, short of an MRI scan—which some believe could offer clues to how the brain looks as it constructs a lie, but is far too impractical for everyday use. The act of reading text opens up a window into cognition, and tracking how people are reading goes beyond simply recording a response to stimuli—it’s a peek at the manipulation of ideas and the reconstruction of reality as the deceiver tries to maintain a lie. In essence, reading occurs on a deep psychological level that’s also tied into basic physical movements that are exceedingly difficult to control. It happens quickly, and perhaps without your knowing it. And a liar already has plenty of brain resources tied up, as Kircher and Hacker’s team found out.
Truth or Consequences
Hacker, an expert in the reading process and how we acquire, facilitate, and utilize reading skills, was familiar with eye-tracking hardware through his work. The head-mounted eye tracking machine that Hacker had used in reading studies looks like something you might see in the ophthalmologist’s office. It records where the eye falls on a given passage of text, indicating words or phrases that seem to be giving the wearer some trouble. For example, in this paragraph, the machine might indicate the reader’s repeated scanning of the word ophthalmologist. It tabulates how often the reader’s eye backtracks to the word, and how long it may linger there.
Hacker leaned on Kircher’s technical expertise in customizing the eye tracker and coming up with appropriate software. Kircher is a widely regarded expert in lie detection and computational analysis of data from polygraph systems. He has evaluated the merits of polygraph systems, analyzed shortcomings in them, and knows how they work, inside and out.
Hacker and Kircher also enlisted the help of cognitive psychologist and Assistant Professor Anne Cook, who has extensive expertise with eye-tracking technology applied to reading processes, Professor Dan Woltz, an expert on learning and memory who collaborated previously with Kircher on research on deception, and grad students Andrea Webb, Dahvyn Osher, and Sean Kristjansson. They tweaked and rigged the eye tracker, wrote code for the software, and spent months ironing out the wrinkles. With a seed grant from the U to get the research ball rolling, the team set up a “Mission: Impossible” experiment.
The experiment worked like this: A fake office was set up, arranged to look like any other university office. The team rounded up 20 research participants and instructed them to proceed to the office and ask for Dr. Williams. The secretary at the desk (not a real secretary, but someone helping with the study) would then inform them that there was no Dr. Williams. The participant thanked the secretary, left the office, and waited inconspicuously outside the office until the secretary left the scene. The participant then re-entered the office, rifled through the secretary’s purse, and stole $20. Another 20 participants committed a different “crime.” They were instructed to slip a computer disk into a graduate student’s PC and download financial information. Kircher and Hacker were curious to find out if they could distinguish between the participants who had committed the cash crime and the information crime. An additional 40 innocent participants were told that they were suspected of committing one of the crimes, but that they should deny involvement in either one and that they should report to the reading laboratory for their test.
Afterward, the volunteers were fitted with the modified eye-tracking system and given a questionnaire composed of 144 statements, some pertaining to the cash crime (such as “True or False: I took the $20”), some pertaining to the information crime, and some of which were neutral questions (“Today is Thursday”), all in an effort to determine when subjects were lying and what they were lying about. “If someone was reading a statement that was incriminating, if they were lying, then we thought we’d likely see more fixations on those particular statements,” explains Hacker.
Since the art of deception requires an incentive, the research team had to offer something to spur lying and motivate the volunteer fibbers to do their best to skirt the truth. After all, liars do what they do to escape punishment, attain wealth or higher social status, make themselves seem more interesting or pleasing, or avoid unpleasant situations—such as incarceration. For the volunteers, there was no fear of punishment to motivate them, so something else was in order. “We offered them $30 extra if they could ‘trick’ the machine,” says Hacker. In other words, lie successfully, and get rewarded for it.
As the research participants worked their way through the questionnaire, the eye-tracking system measured total time spent on each question; eye fixation on certain words, phrases, or questions; time spent rereading passages; and changes in pupil diameter.
The U team indeed found that “guilty” individuals tended to fixate on questions concerning their crime, but most surprisingly, guilty subjects tended to spend more time (and by time, we’re talking fractions of a second) obsessively re-reading questions about the other crime, in a phenomenon the researchers dubbed the “crossover effect.”
“It’s as if it’s taking more cognitive effort to maintain a lie,” says Hacker, “so they’re being slowed by running across questions pertaining to another crime.” The effect is something like a cognitive speed bump. The increase in pupil diameter indicates that considerable brainpower is being used to create and maintain a falsehood to questions about the crime, so that the appearance of questions regarding another crime slows the reading process even further. Meanwhile, “innocent” individuals sped through questions about both crimes with hardly any fixations or excessive time re-reading questions.
All told, the team could finger liars 85 percent of the time, results on par with, or better than, conventional lie-detection techniques. Some studies of conventional polygraph tests using mock crimes have about an 80 percent success rate—or sometimes lower—so the eye-tracking system does, at this point, show promise. “What we’ve discovered here no one has ever done before,” says Kircher.
The device also measures respondents’ pupil dilation, sometimes an indicator of concentrative effort. The theory is that unspooling a half-truth or an outright fabrication takes far more mental effort than simply telling the truth, so it follows that a liar’s pupils would dilate. However, the phenomenon only makes itself apparent for a short period of time—two or three seconds—and levels off as respondents continue to work their way through the questionnaire. Like all lie-detection systems, Kircher and Hacker’s eye tracker must rely on several variables factored together, not just a single element. There is no single characteristic that would pinpoint who pocketed the 20 bucks from the secretary’s desk.
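The idea of factoring several measures together—rather than trusting any single signal—can be sketched as a simple weighted scoring model. Everything below is illustrative: the feature names, weights, and logistic formula are assumptions for the sketch, not the U team’s actual statistical model.

```python
import math

# Hypothetical feature weights -- illustrative only, not the researchers' real model.
WEIGHTS = {
    "read_time_s": 0.9,        # total time spent reading the statement
    "fixations": 0.7,          # number of eye fixations on key words
    "rereads": 1.1,            # passes back over the statement
    "pupil_dilation_mm": 1.5,  # change in pupil diameter
}
BIAS = -4.0  # baseline offset so neutral readings score low

def deception_score(features):
    """Combine several eye-tracking measures into a single 0-to-1 score
    via a logistic function; no one measure decides the outcome alone."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Made-up readings for a "guilty" vs. an "innocent" response pattern.
guilty = {"read_time_s": 2.4, "fixations": 3, "rereads": 2, "pupil_dilation_mm": 0.6}
innocent = {"read_time_s": 1.1, "fixations": 1, "rereads": 0, "pupil_dilation_mm": 0.1}

print(deception_score(guilty) > deception_score(innocent))  # True
```

The point of the combined score is exactly the one the researchers make: a subject might show elevated pupil dilation for an innocent reason, but elevated dilation *plus* extra re-reading *plus* longer fixation times pushes the combined score up in a way no lone variable would.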
You Say Tomato, I Say Polygraph
Recent trends in lie detection attempt to answer such criticisms by emphasizing computer analysis of results, taking the ever-so-fallible human equation out of the mix. With old-fashioned polygraphs, two experts peering at polygraph recordings may arrive at different conclusions. One study found that an examiner revisiting the same data six months later may arrive at a different conclusion 10 to 20 percent of the time. This is where Kircher and Hacker’s eye-tracking system really shines: It relies on the computer to sort and analyze much of the information.
But who is willing to buy such a gadget? Turns out, plenty of organizations. And not just law enforcement seeking another technological tool to collar criminals.
Although most industries cannot use polygraphs for pre-employment testing—the Employee Polygraph Protection Act of 1988 restricts an employer’s use of a lie detector to determine honesty or dishonesty, except in cases of suspected theft or sabotage—some high-risk firms, such as those in the security and pharmaceutical industries, may use them regularly in pre-employment situations. But the biggest customer could be federal and state governments, which routinely rely on polygraphs to screen applicants. The FBI, Secret Service, and DEA use pre-employment polygraph examinations, and there have been calls to update the test with something better and more accurate.
The federal government is willing to spend a lot of money for accuracy. Just recently, the U.S. Department of Homeland Security granted Rutgers University $3.5 million to aid researchers in developing dependable computer-based lie detectors.
Now the U of U team is looking to the National Science Foundation and others for additional funding. Cook, Webb, and Kircher recently returned from Washington, D.C., where they presented data about the eye tracker, hoping to catch the eye of the Department of Defense, among others.
But there is still a long way to go—more presentations before committees, and more grants. Yet Kircher is optimistic. “With sufficient funding, we could get this out in three to five years,” he says.
If so, federal agencies, law enforcement, and certain industries may be knocking on doors at the U with a few questions of their own.
—Jason Matthew Smith is editor of Continuum.