FEAD: Introduction to the fNIRS-EEG Affective Database - Video Stimuli
Published in: IEEE Transactions on Affective Computing, Vol. 16, No. 1, pp. 15-27
Main Authors:
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2025
Summary: This article presents FEAD, an fNIRS-EEG affective database for training emotion recognition models. The electrical activity and brain hemodynamic responses of 37 participants were recorded, along with the categorical and dimensional emotion ratings they gave to 24 affective audio-visual stimuli. The relationship between the neurophysiological signals and the subjective ratings was investigated, and a significant correlation was found in the prefrontal cortex region. Binary classification of affective states was performed using a subject-dependent approach, considering both the fusion of the two modalities, functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG), and each modality separately. In addition, the temporal dynamics of the recorded data were explored in shorter trials, where the fusion of features from both modalities yielded significantly better results than either single modality alone. The database will be made publicly available to encourage researchers to develop more advanced algorithms for affective computing and emotion recognition.
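As a rough illustration of the feature-fusion scheme described in the summary, the minimal Python sketch below compares a subject-dependent binary classifier trained on EEG features, on fNIRS features, and on their concatenation. The feature dimensions, the linear SVM, and the random placeholder data are assumptions for illustration only; they do not reproduce the authors' actual pipeline.

# Minimal sketch of feature-level fNIRS-EEG fusion for subject-dependent
# binary classification of affective states. All dimensions and data here
# are hypothetical placeholders, not taken from the FEAD database.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials = 24                  # one feature vector per affective stimulus (placeholder)
eeg_dim, fnirs_dim = 64, 16    # hypothetical per-trial feature dimensions

# Placeholder features standing in for one participant's recordings.
eeg_features = rng.normal(size=(n_trials, eeg_dim))
fnirs_features = rng.normal(size=(n_trials, fnirs_dim))
labels = np.tile([0, 1], n_trials // 2)   # e.g. high vs. low valence

def evaluate(features, labels):
    """Cross-validated accuracy for one subject (subject-dependent scheme)."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    return cross_val_score(clf, features, labels, cv=4).mean()

# Feature-level fusion: concatenate the two modalities before classification.
fused = np.hstack([eeg_features, fnirs_features])
for name, feats in [("EEG only", eeg_features),
                    ("fNIRS only", fnirs_features),
                    ("EEG+fNIRS fusion", fused)]:
    print(f"{name}: accuracy = {evaluate(feats, labels):.2f}")

On real data, the per-trial feature vectors would be replaced by modality-specific features (e.g., band-power for EEG, hemodynamic-response statistics for fNIRS), and the cross-validation would be run separately for each participant.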
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2024.3407380