Sensing emotional valence and arousal dynamics through automated facial action unit analysis

Bibliographic Details
Published in: Scientific Reports, Vol. 14, no. 1, article 19563 (15 pp.)
Main Authors: Zhang, Junyao; Sato, Wataru; Kawamura, Naoya; Shimokawa, Koh; Tang, Budu; Nakamura, Yuichi
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 22.08.2024

More Information
Summary: Information about the concordance between dynamic emotional experiences and objective signals is practically useful. Previous studies have shown that valence dynamics can be estimated by recording electrical activity from the muscles in the brows and cheeks. However, whether facial actions based on video data and analyzed without electrodes can be used for sensing emotion dynamics remains unknown. We investigated this issue by recording video of participants’ faces and obtaining dynamic valence and arousal ratings while they observed emotional films. Action units (AUs) 04 (i.e., brow lowering) and 12 (i.e., lip-corner pulling), detected through an automated analysis of the video data, were negatively and positively correlated with dynamic ratings of subjective valence, respectively. Several other AUs were also correlated with dynamic valence or arousal ratings. Random forest regression modeling, interpreted using the SHapley Additive exPlanation tool, revealed non-linear associations between the AUs and dynamic ratings of valence or arousal. These results suggest that an automated analysis of facial expression video data can be used to estimate dynamic emotional states, which could be applied in various fields including mental health diagnosis, security monitoring, and education.
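The modeling step described in the summary — random forest regression predicting dynamic ratings from detected AU intensities — can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: the AU selection, the 0–5 intensity scale, and the toy valence signal are assumptions, and impurity-based feature importances stand in for the SHAP analysis the paper actually used.

```python
# Sketch: predict dynamic valence from facial action unit (AU) intensities
# with a random forest, then rank AU contributions. Synthetic data; the
# paper interpreted the model with SHAP rather than impurity importances.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_frames = 500

# Simulated per-frame AU intensities on a 0-5 scale (assumed range).
au04 = rng.uniform(0, 5, n_frames)  # brow lowering
au12 = rng.uniform(0, 5, n_frames)  # lip-corner pulling
au06 = rng.uniform(0, 5, n_frames)  # cheek raising (no signal here)

# Toy valence signal: negative weight on AU04, positive on AU12, plus noise,
# mirroring the sign of the correlations reported in the summary.
valence = -0.6 * au04 + 0.8 * au12 + rng.normal(0, 0.5, n_frames)

X = np.column_stack([au04, au12, au06])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, valence)

for name, imp in zip(["AU04", "AU12", "AU06"], model.feature_importances_):
    print(f"{name}: importance {imp:.3f}")
```

On data like this, the informative AUs (04 and 12) dominate the importance ranking while the uninformative AU contributes little, which is the pattern a SHAP analysis would also surface, with the added benefit of per-frame, signed attributions.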
ISSN: 2045-2322
DOI: 10.1038/s41598-024-70563-8