Classification of affects using head movement, skin color features and physiological signals
| | |
|---|---|
| Published in | 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 2664-2669 |
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.10.2012 |
| Summary | The automated detection of emotions opens the possibility of new applications in areas such as education, mental health and entertainment. There is increasing interest in detection techniques that combine multiple modalities. In this study, we introduce automated techniques to detect users' affective states from a fusion model of facial videos and physiological measures. The natural behavior expressed on faces and the subjects' physiological responses were recorded from subjects (N=20) while they viewed images from the International Affective Picture System (IAPS). This paper provides a direct comparison between user-dependent, gender-specific, and combined-subject models for affect classification. The analysis indicates that the accuracy of the fusion model (head movement, facial color, and physiology) was statistically higher than that of the best individual modality for spontaneous affect expressions. |
| ISBN | 9781467317139, 1467317136 |
| ISSN | 1062-922X |
| DOI | 10.1109/ICSMC.2012.6378149 |
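The abstract describes comparing single modalities (head movement, facial color, physiology) against a fused model for affect classification. The sketch below is only an illustration of that general idea using feature-level fusion (concatenation) with a standard SVM and synthetic data; the feature names, dimensions, labels, and classifier are assumptions and are not taken from the paper's actual pipeline or results.

```python
# Hypothetical sketch: single-modality vs. feature-level fusion for affect classification.
# All features, dimensions, and labels are synthetic placeholders, not the paper's data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 200  # e.g., image-viewing trials pooled across subjects (assumed)

# Per-modality feature blocks (dimensions are arbitrary placeholders):
head_movement = rng.normal(size=(n_samples, 6))     # e.g., pitch/yaw/roll statistics
facial_color = rng.normal(size=(n_samples, 12))     # e.g., skin-color channel statistics
physiology = rng.normal(size=(n_samples, 8))        # e.g., heart-rate / skin-conductance features
labels = rng.integers(0, 2, size=n_samples)         # e.g., low vs. high arousal (assumed)

def evaluate(features, name):
    """Report cross-validated accuracy of a standard SVM on one feature set."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    acc = cross_val_score(clf, features, labels, cv=5).mean()
    print(f"{name:>7s}: {acc:.3f}")

# Compare each modality alone against simple concatenation-based fusion.
evaluate(head_movement, "head")
evaluate(facial_color, "color")
evaluate(physiology, "physio")
evaluate(np.hstack([head_movement, facial_color, physiology]), "fusion")
```

Concatenation followed by a common classifier is just one possible fusion scheme; the paper's own fusion model and evaluation protocol (user-dependent, gender-specific, and combined-subject models) are described in the full text.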