The Creepy Side Of Emotion Recognition Technology


According to a BBC report, China is testing emotion recognition tech on Uyghurs in Xinjiang. The BBC's source, a software engineer who claimed to have installed the camera systems in police stations in the province, said the Chinese government uses Uyghurs as test subjects for various experiments.

The unnamed source said the emotion detection camera, placed 3m from the subject, is trained to detect the slightest changes in facial expressions and even skin pores. The source likened it to a lie detector, but far more advanced. The software outputs a pie chart, with the red segment representing a negative or anxious state of mind.

The source claimed the software is designed for “pre-judgement without any credible evidence”.

Last year, True Light College, a Hong Kong-based secondary school for girls, made headlines for using a surveillance system in online classes. The school used software called 4 Little Trees, an AI-based program that claims to read children's emotions as they learn. The algorithm measures micro-movements of facial muscles to identify emotions such as happiness, sadness, anger, fear, or surprise. According to the founder of 4 Little Trees, up to 83 schools in Hong Kong adopted the system in the past year.
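The article does not detail how 4 Little Trees works internally, but systems of this kind typically track facial landmarks from frame to frame and feed the displacements, the "micro-movements", to a trained classifier. Below is a heavily simplified sketch of that pipeline; the landmark data, classifier weights, and five-label set are hypothetical stand-ins, not the product's actual model.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise"]

def micro_movement_features(prev_pts: np.ndarray, curr_pts: np.ndarray) -> np.ndarray:
    """Flatten per-landmark displacement vectors between two video frames.

    prev_pts, curr_pts: (N, 2) arrays of facial landmark coordinates, as
    produced by any off-the-shelf face-landmark detector.
    """
    return (curr_pts - prev_pts).ravel()

def classify(features: np.ndarray, weights: np.ndarray) -> str:
    """Score the displacements with a linear model (weights are hypothetical)
    and force the result into one of five discrete emotion labels."""
    scores = weights @ features
    return EMOTIONS[int(scores.argmax())]

# Toy demo: random data stands in for real video frames and a trained model.
rng = np.random.default_rng(0)
n_landmarks = 68
prev_pts = rng.random((n_landmarks, 2))
curr_pts = prev_pts + rng.normal(scale=0.01, size=(n_landmarks, 2))
weights = rng.normal(size=(len(EMOTIONS), n_landmarks * 2))
print(classify(micro_movement_features(prev_pts, curr_pts), weights))
```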

Many firms are working on similar emotion recognition technology, an extension of facial recognition. Google and Microsoft offer basic emotion analysis in their vision services. Amazon claimed that its Rekognition facial analysis software assesses up to eight emotions, including fear. The company cited retail as one of the use cases, saying stores can use emotion recognition to understand the behaviour patterns of shoppers and identify trends early from live camera feeds.
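For a sense of what this looks like in practice, Rekognition's DetectFaces API returns per-face emotion labels with confidence scores when called with the "ALL" attributes setting. A minimal boto3 sketch follows; the image file name is a placeholder, and configured AWS credentials are assumed.

```python
import boto3

# Assumes configured AWS credentials; "shopper.jpg" is a placeholder file name.
rekognition = boto3.client("rekognition")

with open("shopper.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" includes the Emotions attribute
    )

# Each detected face carries a list of emotion labels with confidence scores.
for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```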

In the past, Disney has used software to test volunteers' reactions to films such as Star Wars: The Force Awakens and Zootopia. Automakers such as Ford, BMW, and Kia Motors use emotion recognition software to monitor drivers' alertness.

Challenges

According to a MarketsandMarkets report, the global market for emotion detection technology was worth $19.5 billion in 2020 and is expected to reach $37.1 billion by 2026, a CAGR of 11.3 percent over the forecast period. Wider adoption of IoT, AI, machine learning, and deep learning technologies, growing demand in the automation industry, and the need for socially intelligent artificial agents are the major growth drivers, the report said.
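As a quick check, the reported figures are internally consistent: the compound annual growth rate implied by the 2020 and 2026 values (taken from the report as quoted above) works out as follows.

```python
# Sanity-check the implied CAGR from the figures quoted above.
start_value = 19.5   # market size in 2020, USD billion
end_value = 37.1     # projected market size in 2026, USD billion
years = 6            # 2020 -> 2026

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # Implied CAGR: 11.3%
```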

Meanwhile, New York-based research centre AI Now Institute said in a 2019 report that emotion-detection tech is based on a ‘markedly shaky foundation’.

AI experts say emotion recognition systems rest on the assumption that all humans manifest emotions in the same way; something as simple as a raised eyebrow can have different meanings in different cultures. Luke Stark, assistant professor in the Faculty of Information and Media Studies at the University of Western Ontario, said in an interview: "Emotions are simultaneously made up of physiological, mental, psychological, cultural, and individually subjective phenomenological components. No single measurable element of an emotional response is ever going to tell you the whole story. Philosopher Jesse Prinz calls this 'the problem of parts'."

In a recent essay for Nature, Professor Kate Crawford said many such algorithms are based on psychologist Paul Ekman's studies of nonverbal behaviour, conducted in the 1960s, which posited six basic emotions: happiness, sadness, fear, anger, surprise, and disgust. Ekman's work and ideas have formed the basis of emotion-detection technologies used by giants such as Microsoft, IBM, and Amazon. "Six consistent emotions could be standardised and automated at scale — as long as the more complex issues were ignored," Crawford wrote, adding that such technology has not been independently audited for effectiveness.
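To make the simplification Crawford describes concrete: a system built on Ekman's taxonomy typically reduces a face to a probability distribution over six fixed labels and reports the top one, whatever the person actually feels. The sketch below is illustrative only; the input scores and the softmax classifier are hypothetical, not any vendor's actual model.

```python
import numpy as np

# Ekman's six basic emotions, as adopted by many commercial classifiers.
EKMAN_LABELS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

def classify_emotion(logits: np.ndarray) -> tuple[str, float]:
    """Collapse raw model scores into a single Ekman label via softmax.

    Whatever the person actually feels, the output is forced into one of
    six buckets: the standardisation Crawford critiques.
    """
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    top = int(probs.argmax())
    return EKMAN_LABELS[top], float(probs[top])

# Hypothetical scores from some upstream face model.
label, confidence = classify_emotion(np.array([0.2, 1.1, 0.3, 0.9, 0.1, 0.4]))
print(label, round(confidence, 2))  # "sadness 0.28": low confidence, still a verdict
```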

Ekman’s study underplays the importance of cultural context. It assumes a definitive correlation between facial expressions and a person’s emotional state, despite studies indicating that facial expressions are not reliable indicators of emotion.


Shraddha Goled
I am a technology journalist with AIM. I write stories focused on the AI landscape in India and around the world with a special interest in analysing its long term impact on individuals and societies. Reach out to me at shraddha.goled@analyticsindiamag.com.
