Classification of affects using head movement, skin color features and physiological signals

Bibliographic Details
Published in: 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 2664-2669
Main Authors: Monkaresi, H., Hussain, M. S., Calvo, R. A.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2012
Summary: The automated detection of emotions opens the possibility of new applications in areas such as education, mental health and entertainment. There is increasing interest in detection techniques that combine multiple modalities. In this study, we introduce automated techniques to detect users' affective states from a fusion model of facial videos and physiological measures. The natural behavior expressed on faces and the corresponding physiological responses were recorded from subjects (N=20) while they viewed images from the International Affective Picture System (IAPS). This paper provides a direct comparison between user-dependent, gender-specific, and combined-subject models for affect classification. The analysis indicates that the accuracy of the fusion model (head movement, facial color, and physiology) was statistically higher than that of the best individual modality for spontaneous affect expressions.
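
The summary describes fusing head-movement, facial-color, and physiological features and comparing the fused model against each single modality. Since this record does not specify the paper's actual features, labels, or classifier, the following is only a minimal sketch of one common feature-level fusion setup; all variable names, dimensions, and the choice of classifier are assumptions introduced here for illustration.

```python
# Hypothetical sketch: feature-level fusion of three modalities for affect
# classification, using placeholder data in place of real extracted features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples = 200  # e.g. pooled IAPS image-viewing trials

# Placeholder feature blocks, one per modality (real features would come from
# video-based head tracking, facial color analysis, and physiological sensors).
head_movement = rng.normal(size=(n_samples, 6))
facial_color = rng.normal(size=(n_samples, 12))
physiology = rng.normal(size=(n_samples, 8))
labels = rng.integers(0, 2, size=n_samples)  # e.g. low vs. high valence

# Feature-level fusion: concatenate the modality features per trial.
fused = np.hstack([head_movement, facial_color, physiology])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("fusion accuracy:", cross_val_score(clf, fused, labels, cv=5).mean())

# Single-modality baselines for comparison with the fused model.
for name, X in [("head", head_movement), ("color", facial_color), ("physio", physiology)]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())
```

Under this sketch, user-dependent versus combined-subject models would differ only in how the trials are split: training and testing within one subject's data versus pooling trials across all subjects.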
ISBN: 9781467317139, 1467317136
ISSN: 1062-922X
DOI: 10.1109/ICSMC.2012.6378149