A.I. Reads Human Emotions (Without Seeing Faces)

Past advancements in emotion recognition technology (i.e., machines reading human emotions) were limited to facial and voice analysis. That has changed with two recent papers published by researchers in the U.S., South Korea and Switzerland.

One team used deep learning to analyze the context clues of a situation to produce a more accurate reading of human emotions. Context clues include hand movements, the person someone is talking to, and the surrounding environment. The other study developed a novel method of analyzing human body language, classifying a person’s walking style as expressing one of four emotions: happy, sad, angry or neutral.

Both teams tout the potential of this technology, and we understand why. It could help doctors identify underlying mental health conditions, or flag people at risk of suicide based on how they walk. But there’s also potential for overreach.

The U.S. TSA is already testing facial recognition software that matches airline passengers’ faces with their IDs. It’s not a stretch to imagine a future where this program is expanded to include body language and context clue analysis as well. Nor is it out of the realm of possibility that private businesses like grocery stores and taxi companies employ the technology too.

Past research has shown that surveillance like this increases stress, anxiety and depression among those being monitored. And it eerily mirrors the world George Orwell warned about in his novel 1984. In that story, people were monitored in their homes by telescreens that analyzed body language, facial movements and speech patterns to find “thought criminals.”

We’re not there yet, but the public needs to be aware of these developments so it can consider regulating emotion recognition technology before it is widely adopted.

“Identifying Emotions from Walking Using Affective and Deep Features,” Randhavane et al. (2019):

“Context-Aware Emotion Recognition Networks,” Lee et al. (2019):

“Employee stress and health complaints in jobs with and without electronic performance monitoring,” Smith et al. (1992):

It’s Bloody Science! LLC created the text, music and audio of this video. All images, sounds and video clips are freely available in the public domain or under Creative Commons licenses. No copyrighted material (audio, music, images, animations or videos) has been used in this production.

Big Brother image by Boris U., used under a Creative Commons license here: