Dynamic Analysis of Automatic Emotion Recognition Using Generalized Additive Mixed Models

Damien Dupre, Adam Booth, Andrew Bolster, Gawain Morrison, Gary McKeown

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

The increasing ease of access to large participant populations via online recruitment platforms offers considerable promise for studying aspects of emotion and participants’ relationships to brands, artistic media and advertising content. Large participant samples can be gathered quickly and easily, and their facial expressions can be assessed from video captured by the webcams that are ubiquitous in modern computing environments. However, a long-standing issue is the reliability of the data provided by automatic facial expression recognition systems. We evaluated the facial expressions of 836 participants elicited by a sad video presented online. This sizeable sample gives the experiment high statistical power, and the results showed, as expected, that Sadness expressions are recognized significantly more frequently than other facial expressions (i.e. Happiness, Surprise and Disgust). However, by using Generalized Additive Mixed Models (GAMM) to analyse the time series, we show that dynamic statistical analyses can avoid biased interpretations. Technical sensitivity, posture interpretation, emotion idiosyncrasy, the intensity of spontaneous expressions and the social context of emotion elicitation are open questions that must be answered before online facial expression recognition can be performed reliably.
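The core idea behind the GAMM analysis described above is to model recognition scores as a smooth function of time rather than comparing only static averages. The abstract does not give implementation details (GAMMs are typically fitted with R's mgcv package), so the following is only a minimal NumPy sketch of the underlying technique: a penalized spline fitted to a simulated recognition-probability time series for one hypothetical participant. All data, knot choices and parameter values here are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: probability that "Sadness" is recognised at each
# moment of a 60 s sad video, for one simulated participant.
t = np.linspace(0, 60, 120)
true_trend = 0.2 + 0.6 * np.exp(-((t - 35) ** 2) / 200)  # peak mid-video
y = true_trend + rng.normal(0, 0.05, t.size)

# Gaussian radial basis expansion of time: the "smooth term" s(time).
knots = np.linspace(0, 60, 12)
B = np.exp(-((t[:, None] - knots[None, :]) ** 2) / (2 * 5.0 ** 2))
B = np.column_stack([np.ones_like(t), B])  # intercept + basis columns

# Penalised least squares; a ridge penalty stands in for the GAM
# smoothness penalty: beta = (B'B + lambda * I)^-1 B'y
lam = 1e-3
beta = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
fitted = B @ beta

# The fitted curve tracks the underlying dynamic trend, which a single
# per-video average would hide.
rmse = np.sqrt(np.mean((fitted - true_trend) ** 2))
print(round(rmse, 3))
```

In a full GAMM, per-participant random effects would be added on top of the shared smooth, so that idiosyncratic expressiveness does not bias the group-level trend; that extension is what distinguishes a GAMM from the plain GAM sketched here.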
Original language: English
Title of host publication: Symposium on Computational Modelling of Emotion: Theory and Applications
Pages: 158-163
Number of pages: 6
Publication status: Published - 18 Apr 2017

