Development of a Universal Validation Protocol and an Open-Source Database for Multi-Contextual Facial Expression Recognition

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 20, p. 8376
Main Authors: La Monica, Ludovica; Cenerini, Costanza; Vollero, Luca; Pennazza, Giorgio; Santonico, Marco; Keller, Flavio
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 10 October 2023

Summary: Facial expression recognition (FER) poses a complex challenge due to diverse factors such as variations in facial morphology, lighting conditions, and cultural nuances in emotion representation. To address these hurdles, FER algorithms leverage advanced data analysis to infer emotional states from facial expressions. In this study, we introduce a universal validation methodology that assesses the performance of any FER algorithm through a web application in which subjects respond to emotive images. We also present FeelPix, a labelled database of facial landmark coordinates generated during FER algorithm validation. FeelPix is available for training and testing generic FER algorithms that accurately identify users’ facial expressions. To demonstrate its reliability, a test algorithm classifies emotions from FeelPix data. Designed as a computationally lightweight solution, it is well suited to online systems. Our contribution improves facial expression recognition by enabling the identification and interpretation of the emotions associated with facial expressions, offering insight into individuals’ emotional reactions, with implications for healthcare, security, human-computer interaction, and entertainment.
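
The summary describes FeelPix as a labelled set of facial landmark coordinates intended to train and test lightweight emotion classifiers. Below is a minimal sketch of that kind of pipeline, not the authors’ implementation: the 68-point landmark layout, the six-emotion label set, and the synthetic stand-in data are assumptions for illustration, since the actual FeelPix schema is not given here.

```python
# Minimal sketch (assumptions, not the paper's method): train a
# computationally lightweight emotion classifier on flattened facial
# landmark coordinates, the kind of data FeelPix provides.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic stand-in for FeelPix: 500 samples, 68 (x, y) landmarks
# flattened to 136 features, each labelled with one of six basic
# emotions. The label set is a common convention, assumed here.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
X = rng.normal(size=(500, 68 * 2))            # landmark coordinates
y = rng.integers(0, len(EMOTIONS), size=500)  # emotion labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# k-NN is cheap to train and small enough to serve from an online
# system, in line with the stated goal of a lightweight solution.
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test),
                            target_names=EMOTIONS))
```

On real landmark data one would normalize coordinates (e.g. center and scale per face) before classification; the random stand-in above only exercises the pipeline end to end.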
Author Note: Current address: National Research Council, Institute of Cognitive Sciences and Technologies, 00185 Rome, Italy.
ISSN: 1424-8220
DOI: 10.3390/s23208376