Graph-Embedded Convolutional Neural Network for Image-Based EEG Emotion Recognition
Published in: IEEE Transactions on Emerging Topics in Computing, Vol. 10, No. 3, pp. 1399-1413
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 July 2022
Summary: Emotion recognition from electroencephalograph (EEG) signals has long been essential for affective computing. In this article, we approach EEG emotion recognition by converting multichannel EEG signals into images, so that richer spatial information can be exploited and the problem of EEG-based emotion recognition is recast as one of image recognition. To this end, we propose a novel method for generating continuous images from discrete EEG signals: offset variables following a Gaussian distribution are introduced for each EEG channel to alleviate the bias of electrode coordinates during image generation. In addition, we propose a graph-embedded convolutional neural network (GECNN) that combines local convolutional neural network (CNN) features with global functional features to provide complementary emotion information. In GECNN, an attention mechanism extracts more discriminative local features, while dynamical graph filtering explores the intrinsic relationships between different EEG regions. The local and global functional features are finally fused for emotion recognition. Extensive experiments under subject-dependent and subject-independent protocols evaluate the proposed GECNN model on four datasets: SEED, SDEA, DREAMER, and MPED.
ISSN: 2168-6750
DOI: 10.1109/TETC.2021.3087174