Gesture-Based Affective Computing on Motion Capture Data

Bibliographic Details
Published in: Affective Computing and Intelligent Interaction, pp. 1-7
Main Authors: Kapur, Asha; Kapur, Ajay; Virji-Babul, Naznin; Tzanetakis, George; Driessen, Peter F.
Format: Book Chapter
Language: English
Published: Berlin, Heidelberg: Springer Berlin Heidelberg, 2005
Series: Lecture Notes in Computer Science

Summary: This paper presents research that uses full-body skeletal movements, captured with video-based sensor technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses a series of six cameras to capture lightweight markers placed on various points of the body in 3D space and digitizes movement into x, y, and z displacement data. Gestural data depicting four emotions (sadness, joy, anger, and fear) was collected from five subjects. Experimental results with different machine learning techniques show that automatic classification accuracy on this data ranges from 84% to 92%, depending on how it is calculated. To put these automatic classification results into perspective, a user study on human perception of the same data was conducted, yielding an average classification accuracy of 93%.
ISBN: 3540296212; 9783540296218
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/11573548_1
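
The following Python sketch is a rough, hypothetical illustration of the kind of pipeline the summary describes: per-marker x/y/z displacement features from motion capture recordings fed to a machine learning classifier. The feature layout, classifier choice (an SVM), the placeholder data, and the cross-validation scheme are all assumptions for illustration only, not the authors' method or results.

# Minimal sketch (not the authors' code): classifying emotion labels from
# per-marker x/y/z displacement features, assuming such features have
# already been extracted from Vicon-style motion capture recordings.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["sadness", "joy", "anger", "fear"]  # classes named in the summary

rng = np.random.default_rng(0)

# Placeholder data standing in for real recordings: each row is one gesture,
# summarized here as mean and standard deviation of x/y/z displacement for a
# hypothetical set of markers (this feature set is an assumption).
n_gestures, n_markers = 200, 14
n_features = n_markers * 3 * 2          # (x, y, z) * (mean, std) per marker
X = rng.normal(size=(n_gestures, n_features))
y = rng.integers(0, len(EMOTIONS), size=n_gestures)

# Standardize features, fit an RBF support vector classifier, and estimate
# accuracy with 5-fold cross-validation. With random placeholder labels the
# reported accuracy is meaningless; only the mechanics are illustrated.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")

Replacing X and y with real displacement features and emotion labels would let the same script estimate classification accuracy in the spirit of the experiments the summary reports.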