Improving Facial Emotion Recognition Model in Social Robot Using Graph-Based Techniques with 3D Face Orientation

Bibliographic Details
Published in: 2024 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), pp. 234-237
Main Authors: Laohakangvalvit, Tipporn; Subsa-ard, Nopphakorn; Fulini, Felipe Yudi; Suzuki, Kaoru; Sugaya, Midori
Format: Conference Proceeding
Language: English
Published: IEEE, 15.09.2024
DOI: 10.1109/ACIIW63320.2024.00050

Summary: In today's era of Human-Robot Interaction (HRI), the ability of robots to understand and respond to human emotions is crucial. Facial Expression Recognition (FER) plays an important role in enabling social robots to engage naturally with humans. However, existing FER systems face challenges such as computational complexity and sensitivity to facial orientation, which limit their practical effectiveness. This paper proposes a novel approach using 3D face orientation to enhance the accuracy and robustness of FER models on social robots. We demonstrated our methodology using the FER2013 dataset, employing graph-based facial landmarks and a lightweight Artificial Neural Network (ANN) architecture. Our results show significant improvements in classifying negative emotions, while also highlighting areas for further refinement across a range of emotional states.
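
The summary describes a pipeline in which graph-based 3D facial landmarks are fed to a lightweight ANN classifier over the FER2013 emotion categories. As a rough illustration only, and not the authors' implementation, the sketch below assumes MediaPipe Face Mesh for 3D landmark extraction and a small Keras dense network; the landmark count, layer sizes, and class labels are assumptions rather than details from the paper.

    # Illustrative sketch: 3D facial landmarks -> lightweight ANN classifier.
    # Assumes MediaPipe Face Mesh and Keras; the paper's actual graph
    # construction, pose handling, and network design are not reproduced here.
    import numpy as np
    import cv2
    import mediapipe as mp
    import tensorflow as tf

    # FER2013 emotion classes (standard seven-way labeling).
    EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

    def extract_landmarks(image_bgr):
        """Return a flat (468*3,) array of 3D face-mesh landmarks, or None if no face is found."""
        with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
            result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            return None
        pts = result.multi_face_landmarks[0].landmark
        return np.array([[p.x, p.y, p.z] for p in pts], dtype=np.float32).flatten()

    def build_model(n_inputs=468 * 3, n_classes=len(EMOTIONS)):
        """A small dense network standing in for the paper's lightweight ANN."""
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(n_inputs,)),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(n_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

A pose-normalization step would plausibly sit between landmark extraction and classification to account for the 3D face orientation the paper emphasizes, but its exact form is not described in this record.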