Imagine if lie-detection apps became widely available on everyone's phones or computers. Merely by processing a video of you speaking to someone, analyzing your facial expressions, body movements, vocal cues, and the semantic content of your speech, the apps could use data science to attempt to predict whether you are lying or telling the truth about who you voted for, your salary, your fidelity, and more.
"What will that do to society?" asks Ehsan Hoque. "What will it do to our relationships?"
So, even as the Rochester computer scientist creates groundbreaking technology to detect lies for a host of beneficial medical and public-safety purposes, he also wants to help ensure the technology is used constructively and ethically.
He'll have the chance to pursue both goals, developing the technology and establishing a framework for its appropriate use, with a $1 million Early Career Award for Scientists and Engineers (ECASE-ARMY) from the Army Research Office (ARO).
ECASE awards are the highest honor bestowed by the ARO to outstanding scientists and engineers beginning their independent careers. The awards recognize researchers who pursue innovative research at the frontiers of science and technology and who also engage in scientific leadership, education, or community outreach.

"This is a big boost that will help my lab continue to pursue blue-sky research," says Hoque, an assistant professor of computer science and the Asaro Biggar Family Fellow in Data Science. He is also interim director of the Goergen Institute for Data Science.
"The Army Research Office encourages young scientists to take on ridiculously difficult problems and allows ample intellectual freedom to pursue them," Hoque says. "It is a wonderful opportunity for young scientists like myself to take on high risks early in their careers."
The technology Hoque hopes to develop with this award builds upon his lab's recent work on deception detection. His Rochester Human-Computer Interaction (ROC-HCI) Lab used data science to create an online crowdsourcing framework called ADDR (Automated Dyadic Data Recorder), build the largest publicly available deception dataset to date, and discover, for example, why some smiles are more deceitful than others.
Applications include:
- improved security screening of passengers in airports;
- helping medical practitioners realize when patients are hiding suicidal symptoms;
- new ways for law-enforcement agents to assess risk when engaging in self-defense;
- and helping individuals with autism realize when they are being manipulated or deceived.
Making interrogations more objective
With the new award, Hoque and his lab will develop machine-learning algorithms that could make the interrogations and interviews used in these settings more objective.
A typical interrogation is conducted in distinct phases, starting with general questions to establish rapport.
"How someone answers in the rapport-building phase could influence how the interviewer looks at that person when the important, relevant questions are asked in the subsequent phases, biasing the interviewer's ability to tell if someone is lying or telling the truth," Hoque says.
"So, imagine as I'm interviewing you, I have a computer that is helping me out, that can treat each phase as unique and objective," Hoque says. "We can build algorithms to help quantify some of the nuances that an interviewer might miss: subtle inconsistencies among facial cues, what is said, how it is said."
At the end of each phase, the algorithm, trained with reinforcement learning, would recommend either proceeding to the next phase or asking additional questions, based on whether it has "seen" enough.
The interviewer would still be in charge and still make the decisions, Hoque emphasizes, but "the algorithm is providing independent, quantifiable metrics, so that the interrogators can further quantify their decisions. It adds objectivity and transparency to the interrogation process."
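The phase-gating idea described above can be sketched in code. This is only an illustrative sketch: the cue names, weights, and threshold rule below are assumptions for demonstration, and a simple averaged score stands in for the reinforcement-learning model the article says the lab would actually train.

```python
# Hypothetical sketch of phase-gated interview support: at the end of each
# phase, aggregate per-question "inconsistency" scores across modalities
# (facial cues, what is said, how it is said) and recommend either advancing
# to the next phase or asking more questions. The human interviewer still
# makes the final call; the scores only add quantifiable context.
from dataclasses import dataclass


@dataclass
class PhaseObservation:
    facial_inconsistency: float   # 0..1, mismatch among facial cues
    verbal_inconsistency: float   # 0..1, mismatch in what is said
    vocal_inconsistency: float    # 0..1, mismatch in how it is said


def phase_confidence(obs: PhaseObservation) -> float:
    """Weighted aggregate of the cue scores; the weights are made up."""
    return (0.4 * obs.facial_inconsistency
            + 0.35 * obs.verbal_inconsistency
            + 0.25 * obs.vocal_inconsistency)


def recommend_next_step(observations: list[PhaseObservation],
                        threshold: float = 0.5,
                        min_questions: int = 3) -> str:
    """Recommend advancing once enough stable evidence has been seen."""
    if len(observations) < min_questions:
        return "ask more questions"
    avg = sum(phase_confidence(o) for o in observations) / len(observations)
    # A phase with consistently low inconsistency suggests it is safe to
    # move on; a noisy phase warrants more questions before proceeding.
    return "advance to next phase" if avg < threshold else "ask more questions"
```

A real system would replace the fixed weights and threshold with a learned policy, but the interface, per-phase scores in, advance-or-continue recommendation out, matches the workflow described in the article.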
Greater objectivity could benefit both the interviewer and the person being interviewed, especially in security and law enforcement settings, he says.
Legal and ethical issues of lie-detection technology
But Hoque worries about what could happen when the technology becomes widely available. A possible worst-case scenario: mandatory use of the technology during job interviews.
"We have talked with people who say they would never consider lying in job interviews but would be unwilling to do an interview if a camera were turned on them with lie-detecting technology," Hoque says. "It's the comfort factor. So what kind of protection would be available for these people?"
To explore the legal and ethical issues that might arise, his team will identify the ways the technology could be misused and recommend a framework for its use, aimed at the greatest net benefit to society.
Hoque will collaborate with Mark Frank, a professor of communication at the University at Buffalo who has more than 25 years of experience studying deception with US law enforcement agencies.
"I have a fairly well-developed eye for work that would be not just scientifically sound, but also of clear utility, especially in law enforcement and national security settings," Frank says. "I strongly believe Dr. Hoque is on the right track here to make some practical breakthroughs."
Technologies that enhance human ability using data science
"Professor Hoque is considered a pioneer in designing and validating AI technologies that can improve human ability," a Department of Defense fact sheet notes.
For example, as a PhD student at MIT, Hoque developed an app in which a 3D character automatically plays the role of a recruiter, conducts a job interview, and provides feedback. The app provided the first scientific evidence that humans can improve their face-to-face interpersonal skills by interacting with a computer.
Hoque has since applied this framework across multiple other domains.
His previous honors include a National Science Foundation CAREER award and recognition from both Science News and MIT Technology Review.
He is also an inaugural member of the ACM Future of Computing Academy, established to guide and empower the next generation of computing leaders.
Read more

University researchers are using data science to analyze more than 1 million facial expressions to more accurately detect deception based on a smile.

Public speaking is the number one reported fear of Americans. The ROCspeak platform is a powerful technology tool to fight that fear.