
Scientists are trying to expose risks of AI emotion recognition to boost public debate

AI emotion recognition technology

Scientists are inviting people to pull faces at their webcam or smartphone camera to boost the public debate around controversial emotion recognition technology.

Researchers from Cambridge University and UCL have built a website called Emojify to help people understand how computers can scan facial expressions to detect emotion, through what is known as artificial intelligence emotion recognition.

Dr Alexa Hagerty, project lead and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence, said the technology, which is already used in parts of the world and designed to identify human emotions using machine-learning algorithms, is “powerful” but “flawed”.

Visitors to the website can play a game, pulling faces at their device’s camera to try to get the emotion recognition system to recognise six emotions – happiness, sadness, fear, surprise, disgust and anger.

They can also answer a series of optional questions to assist research, including whether they have experienced the technology before and if they think it is useful or concerning.


AI emotion recognition technology is in use across a variety of sectors in China including for police interrogation and to monitor behaviour in schools. Other potential uses include in border control, assessing candidates during job interviews and for businesses to collect customer insights.

The researchers say they hope to start conversations about the technology and its social impacts.

Dr Hagerty said: “Many people are surprised to learn that emotion recognition technology exists and is already in use.

“Our project gives people a chance to experience these systems for themselves and get a better idea of how powerful they are, but also how flawed.”

Dr Igor Rubinov, of Dovetail Labs, a consultancy specialising in technology ethics, who directed the design of the interactive research website, said: “We want people to interact with an emotion recognition system and see how AI scans their faces and what it might get wrong.”

Juweek Adolphe, head designer, said: “It is meant to be fun but also to make you think about the stakes of this technology.”

Dr Hagerty said the technology has “worrying potential for discrimination and surveillance”.

She went on: “The science behind emotion recognition is shaky. It assumes that our facial expressions perfectly mirror our inner feelings.

“If you’ve ever faked a smile, you know that it isn’t always the case.”

Dr Alexandra Albert, of the Extreme Citizen Science (ExCiteS) research group at UCL, said a “more democratic approach” is needed to determine how the technology is used.

“There hasn’t been real public input or deliberation about these technologies,” she said.

“They scan your face, but it is tech companies who make the decisions about how they are used.”

The researchers said their website does not collect or save images or data from the emotion system.

The optional responses to questions will be used as part of an academic paper on citizen science approaches to better understand the societal implications of emotion recognition.

To try the artificial intelligence emotion recognition technology, visit the Emojify website.
