New research shows Artificial Intelligence still lags behind humans when it comes to recognising emotions – Dublin City University

New DCU-led research into the accuracy of artificial intelligence at reading emotions on our faces has shown that it still lags behind human observers in telling whether we're happy or sad. The difference was particularly pronounced for spontaneous displays of emotion.

The recently published study, "A performance comparison of eight commercially available automatic classifiers for facial affect recognition", looked at eight out-of-the-box automatic classifiers for facial affect recognition (artificial intelligence that can identify human emotions on faces) and compared their emotion recognition performance to that of human observers.

It found that human recognition accuracy was 72%, whereas among the artificial intelligence systems tested, the researchers observed a large variance in recognition accuracy, ranging from 48% to 62%.

The work was conducted by Dr. Damien Dupré from Dublin City University's Business School, Dr. Eva Krumhuber from the Department of Experimental Psychology at UCL, Dr. Dennis Küster from the Cognitive Systems Lab, University of Bremen, and Dr. Gary J. McKeown from the Department of Psychology at Queen's University Belfast.

Key data points

- Human observers recognised emotions with 72% accuracy.
- The eight automatic classifiers tested ranged from 48% to 62% accuracy.
- Recognition was hardest for spontaneous (as opposed to posed) expressions.

How the study was done

Two well-known dynamic facial expression databases were chosen: BU-4DFE from Binghamton University in New York, and a second database from the University of Texas at Dallas.

Both are annotated in terms of emotion categories, and contain either posed or spontaneous facial expressions. All of the examined expressions were dynamic to reflect the realistic nature of human facial behavior.

To evaluate the accuracy of emotion recognition, the study compared the performance achieved by human judges with that of eight commercially available automatic classifiers.

Dr. Damien Dupré said:

"AI systems claiming to recognise humans' emotions from their facial expressions are now very easy to develop. However, most of them are based on inconclusive scientific evidence that people express emotions in the same way.

For these systems, human emotions come down to only six basic emotions, and they do not cope well with blended emotions.

Companies using such systems need to be aware that the results obtained are not a measure of the emotion felt, but merely a measure of how much one's face matches a face supposed to correspond to one of these six emotions."

Co-author Dr. Eva Krumhuber from UCL added:

"AI has come a long way in identifying people's facial expressions, but our research suggests that there is still room for improvement in recognising genuine human emotions."

Dr. Krumhuber recently led a separate study published in Emotion (also involving Dr. Küster) comparing human vs. machine recognition across fourteen different databases of dynamic facial expressions.

Researchers

Dr. Damien Dupré - Business School, Dublin City University

Dr. Eva Krumhuber - Department of Experimental Psychology, UCL

Dr. Dennis Küster - Cognitive Systems Lab, University of Bremen

Dr. Gary J. McKeown - Department of Psychology, Queen's University Belfast

Photo by Andrea Piacquadio from Pexels

