Facial expression analysis with AFFDEX and FACET: A validation study
| Published in | Behavior research methods, Vol. 50, no. 4, pp. 1446–1460 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US, 01.08.2018 (Springer Nature B.V) |
| Subjects | |
| Online Access | Get full text |
Summary: The goal of this study was to validate AFFDEX and FACET, two algorithms classifying emotions from facial expressions, in iMotions's software suite. In Study 1, pictures of standardized emotional facial expressions from three databases, the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP), the Amsterdam Dynamic Facial Expression Set (ADFES), and the Radboud Faces Database (RaFD), were classified with both modules. Accuracy (Matching Scores) was computed to assess and compare the classification quality. Results show a large variance in accuracy across emotions and databases, with a performance advantage for FACET over AFFDEX. In Study 2, 110 participants' facial expressions were measured while they were exposed to emotionally evocative pictures from the International Affective Picture System (IAPS), the Geneva Affective Picture Database (GAPED), and the Radboud Faces Database (RaFD). Accuracy again differed for distinct emotions, and FACET performed better. Overall, iMotions can achieve acceptable accuracy for standardized pictures of prototypical (vs. natural) facial expressions, but performs worse for more natural facial expressions. We discuss potential sources for limited validity and suggest research directions in the broader context of emotion research.
ISSN: 1554-3528
DOI: 10.3758/s13428-017-0996-1