The SenseEmotion Database: A Multimodal Database for the Development and Systematic Validation of an Automatic Pain- and Emotion-Recognition System

Bibliographic Details
Published in: Multimodal Pattern Recognition of Social Signals in Human-Computer-Interaction, Vol. 10183, pp. 127-139
Main Authors: Velana, Maria; Gruss, Sascha; Layher, Georg; Thiam, Patrick; Zhang, Yan; Schork, Daniel; Kessler, Viktor; Meudt, Sascha; Neumann, Heiko; Kim, Jonghwa; Schwenker, Friedhelm; André, Elisabeth; Traue, Harald C.; Walter, Steffen
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 01.01.2017
Series: Lecture Notes in Computer Science

Summary: In modern industrial societies, the group of older adults (generation 65+) is growing steadily. Many members of this group are severely affected by health problems and suffer from disability and pain. Chronic illness and pain lower the patient's quality of life, so accurate pain assessment is needed to facilitate effective pain management and treatment. In the future, automatic pain monitoring may enable health care professionals to assess and manage pain in an increasingly objective way. To this end, the goal of our SenseEmotion project is to develop automatic pain- and emotion-recognition systems for successful assessment and effective personalized management of pain, particularly for the generation 65+. This paper presents the recently created SenseEmotion Database for pain versus emotion recognition. The database contains data from 45 healthy subjects; for each subject, approximately 30 minutes of multimodal sensory data were recorded. For a comprehensive understanding of pain and affect, three rather different data modalities are included in this study: biopotentials, camera images of the facial region, and, for the first time, audio signals. Heat stimulation was applied to elicit pain, and affective image stimuli accompanied by sound stimuli were used to elicit emotional states.
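As a rough illustration of how the per-subject recordings described in the summary might be organized for downstream processing, the sketch below defines a minimal container for one subject's session. It is only a sketch under stated assumptions: the field names, sampling rates, and the synthetic loader are hypothetical, since this record does not specify the database's actual file formats or channel layout.

```python
# Minimal sketch (assumptions): the record describes three modalities per subject --
# biopotentials, facial camera images, and audio -- plus pain/emotion stimulus labels.
# Field names, sampling rates, and the dummy loader are hypothetical; the real
# SenseEmotion file formats are not specified in this record.
from dataclasses import dataclass
from typing import List
import numpy as np


@dataclass
class SubjectSession:
    """One subject's ~30-minute multimodal recording (illustrative structure only)."""
    subject_id: int
    biopotentials: np.ndarray        # shape (channels, samples), assumed sampling rate
    video_frame_paths: List[str]     # paths to facial-region camera images
    audio: np.ndarray                # mono audio signal, assumed sampling rate
    stimulus_labels: List[str]       # e.g. heat-pain levels or affective image/sound stimuli


def make_dummy_session(subject_id: int) -> SubjectSession:
    """Create a synthetic session for exercising a processing pipeline (no real data)."""
    rng = np.random.default_rng(subject_id)
    return SubjectSession(
        subject_id=subject_id,
        biopotentials=rng.standard_normal((3, 30 * 60 * 512)),  # assumed 512 Hz, 3 channels
        video_frame_paths=[f"subject_{subject_id:02d}/frame_{i:06d}.png" for i in range(100)],
        audio=rng.standard_normal(30 * 60 * 16000),             # assumed 16 kHz
        stimulus_labels=["heat_pain_low", "heat_pain_high", "affective_image_with_sound"],
    )


if __name__ == "__main__":
    session = make_dummy_session(subject_id=1)
    print(session.biopotentials.shape, len(session.video_frame_paths), session.audio.shape)
```

Such a per-subject container keeps the three modalities aligned with their stimulus labels, which is convenient when training or validating a multimodal pain- and emotion-recognition system on data organized along the lines described above.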
ISBN: 9783319592589, 3319592580
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-59259-6_11