Employers jump through all kinds of hoops to try to ensure the integrity of job candidates. Whether it’s reference checks, credit checks or psychometric tests, they spare no expense to get to the truth. But is that enough?
What if they had technology that could gauge, with considerable accuracy, whether a person was being honest about their emotions and telling the truth?
A joint study by researchers at the University of California, San Diego, the University at Buffalo and the University of Toronto found that a computer-vision system can distinguish between real and faked expressions of pain more accurately than humans.
As a result, they say the system could also be used to detect deceptive actions in the realms of security, psychopathology, medicine, law — and job screening.
In highly social species such as humans, faces have evolved to convey rich information, including expressions of emotion and pain, says Kang Lee, a professor at the University of Toronto involved with the study.
"And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements."
The study involved two experiments with a total of 205 human observers, who were asked to assess the veracity of expressions of pain in video clips of individuals. Some of the individuals had their hands immersed in ice water to measure pain tolerance, while others faked painful expressions.
The human observers could not distinguish real expressions of pain from faked ones better than chance. Even after training, their accuracy improved only to 55 per cent, compared to the computer’s 85 per cent.
"Dynamic motion, the movement of muscles when you’re actually experiencing emotion, is different from the smoothness of the actions when you’re posing the emotion. The reason for this is that signals for those expressions originate in different parts of the brain," says Mark Frank, professor of communication at the University at Buffalo, who was involved with the study.
And while humans are good at detecting the presence or absence of a smile, for example, they can’t necessarily tell the difference between a fake smile and a real one.
"The types of movement were fairly similar in the real and fake pain, but the key discriminator was the flow of the movement, which machines pick up pretty easily, where humans (don’t)," he says, citing other studies.
Once the mathematical model is developed, the tool’s HR applications could include recruitment, by screening job candidates, or workers’ compensation, by helping doctors discern whether employees are truly experiencing pain, says Lee.
"I’m pretty sure the accuracy rates will be far better than us, like HR people, sitting in front of the interviewee and looking at their facial expressions."
Any tool that helps with the screening process is appreciated, says Trish Dehmel, managing director of CSI, an investigative services company, in Halifax.
"We believe the lie detector or the polygraph... because it’s a proven technology, a proven tool that can be used to determine truth. Now, if this computer tool was as good as that, then certainly it would be extremely beneficial when you are interviewing someone."
But jurisdictions such as Ontario prohibit the use of something that’s equivalent to a lie detector in employment, according to Christina Hall, a partner in the labour and employment law group at Fasken Martineau in Toronto.
"Most would see that as an invasion of personal privacy," she says. "In Ontario, employers are prohibited from asking for it and employees are granted the right to refuse to engage in that. And, of course, there’s remedies under the act if an employer violates, which could be compensation. It could also be an order to actually hire or reinstate someone."
Employers in other jurisdictions with privacy legislation, such as British Columbia and Quebec — as well as federally regulated employers covered by the Personal Information Protection and Electronic Documents Act (PIPEDA) — would struggle to justify that type of collection as being reasonable for the purposes of assessing somebody for employment, says Hall.
"There’s less invasive measures at your fingertips to arguably get at the same information."
The technology does have privacy issues, says Lee, so it would require some kind of consent from the interviewee. But even the act of asking for permission could serve as a deterrent, he says.
But privacy analysis is a multi-step process where an employer has to prove a number of things, says Hall.
"Just getting somebody’s consent isn’t sufficient if what you’re doing isn’t reasonable in the first place, so getting consent is only one element of a larger package to prove compliance with privacy legislation."
To defend the use of such technology, an employer would have to prove it was accurate and reliable through scientific evidence, says Hall.
"All of that is a lot of effort to go through when, arguably, you have other, more traditional, more accepted forms of background checking at your fingertips that you could perhaps enhance or strengthen to get at the kinds of information that you’re looking at without opening the door to, frankly, the legal quagmire that you would let in if you started using this kind of technology — never mind the message it’s sending to people sitting across the table."
It’s not exactly the most welcoming message an employer can send in an interview process, she says, "putting somebody under, quite literally, some form of electronic microscope."
But it’s so hard to hire the right people now and employers need all the help they can get, says Dehmel.
"You have to remember to weight all your tools, so that you don’t give one more weight perhaps than it deserves. It’s still a game of interviewer and the interviewee and how they connect in that interview, it really is."