An Open Dataset for Impression Recognition from Multimodal Bodily Responses

Bibliographic Details
Published in: International Conference on Affective Computing and Intelligent Interaction and Workshops, pp. 1-8
Main Authors: Wang, Chen; Chanel, Guillaume
Format: Conference Proceeding
Language: English
Published: IEEE, 28.09.2021
ISSN: 2156-8111
DOI: 10.1109/ACII52823.2021.9597421

Summary: We present a dataset (IMPRESSION) for multi-modal recognition of impressions of individuals and dyads. Compared to other databases, we not only elicited impressions using video stimuli, but also recorded natural impression formation between strangers meeting for the first time through video calls. The database allows machine learning studies on impression recognition using multimodal signals of individuals, in relation to their emotion expressivity and with respect to the interlocutor's reactions. The experimental setup provided synchronized recordings from 62 participants of face videos, audio signals, eye gaze data, and peripheral nervous system physiological signals (electrocardiogram, ECG; blood volume pulse, BVP; and galvanic skin response, GSR) captured with wearable sensors. Participants reported their formed impressions along the W & C dimensions in real time. We present the database in detail as well as baseline methods and results for impression recognition in W & C.
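
The summary describes, per participant, several synchronized signal streams plus real-time impression ratings. The record does not specify the dataset's file layout or field names, so the following is only a minimal Python sketch (all names and units are hypothetical) of how one such recording session might be organized for machine learning experiments.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ImpressionSession:
    """One participant's synchronized recording from a dyadic session (hypothetical layout)."""
    participant_id: str
    face_video_path: str   # path to the face video file (assumed format)
    audio_path: str        # path to the audio recording (assumed format)
    # Time-stamped signal streams as (timestamp_seconds, value...) tuples:
    gaze: List[Tuple[float, float, float]] = field(default_factory=list)  # (t, x, y)
    ecg: List[Tuple[float, float]] = field(default_factory=list)          # (t, amplitude)
    bvp: List[Tuple[float, float]] = field(default_factory=list)          # (t, amplitude)
    gsr: List[Tuple[float, float]] = field(default_factory=list)          # (t, conductance)
    # Real-time self-reported impressions in the W & C dimensions: (t, W, C)
    impression_ratings: List[Tuple[float, float, float]] = field(default_factory=list)


# Usage example with placeholder values.
session = ImpressionSession(
    participant_id="P01",
    face_video_path="P01_face.mp4",
    audio_path="P01_audio.wav",
)
session.impression_ratings.append((12.5, 0.7, 0.4))
print(session.participant_id, len(session.impression_ratings))

Keeping every stream time-stamped against a common clock is what would allow the per-modality features to be aligned with the continuous W & C annotations; the actual alignment and feature extraction used for the paper's baselines are not described in this record.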