Domain adaptation spatial feature perception neural network for cross-subject EEG emotion recognition

Bibliographic Details
Published in: Frontiers in Human Neuroscience, Vol. 18, p. 1471634
Main Authors: Lu, Wei; Zhang, Xiaobo; Xia, Lingnan; Ma, Hua; Tan, Tien-Ping
Format: Journal Article
Language: English
Published: Switzerland, Frontiers Media S.A., 17.12.2024
Summary: Emotion recognition is a critical research topic within affective computing, with potential applications across various domains. EEG-based emotion recognition using deep learning frameworks has been applied effectively and achieves commendable performance. However, existing deep learning models struggle to capture the spatial activity features and the spatial topology features of EEG signals simultaneously. To address this challenge, a domain adaptation spatial feature perception network, named DSP-EmotionNet, has been proposed for cross-subject EEG emotion recognition. First, a spatial activity topological feature extractor module, named SATFEM, is designed to capture both the spatial activity features and the spatial topology features of EEG signals. Then, using SATFEM as its feature extractor, DSP-EmotionNet is constructed, significantly improving accuracy in cross-subject EEG emotion recognition. The proposed model surpasses state-of-the-art methods, achieving an average recognition accuracy of 82.5% on the SEED dataset and 65.9% on the SEED-IV dataset.
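A minimal PyTorch sketch may help make the two-branch idea in the abstract concrete: one branch models per-electrode spatial activity features, a second models spatial topology via a simple graph convolution over an electrode adjacency matrix, and a DANN-style gradient-reversal domain classifier stands in for the domain-adaptation component. The class names (GradReverse, GraphConv, DSPSketch), layer sizes, the all-ones adjacency fallback, and the gradient-reversal choice are illustrative assumptions, not the published SATFEM/DSP-EmotionNet architecture.

```python
# Hedged sketch: two spatial branches + gradient-reversal domain adaptation.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients in backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

class GraphConv(nn.Module):
    """One-hop graph convolution over electrodes: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim, adj):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        a = adj + torch.eye(adj.size(0))          # add self-loops
        d = a.sum(1).rsqrt()                      # D^{-1/2} of node degrees
        self.register_buffer("a_hat", d[:, None] * a * d[None, :])

    def forward(self, x):                         # x: (batch, channels, feats)
        return torch.relu(self.lin(self.a_hat @ x))

class DSPSketch(nn.Module):
    """Illustrative stand-in, not the authors' DSP-EmotionNet."""
    def __init__(self, n_ch=62, n_feat=5, n_cls=3, adj=None, hid=32):
        super().__init__()
        adj = torch.ones(n_ch, n_ch) if adj is None else adj  # placeholder graph
        self.activity = nn.Sequential(nn.Linear(n_feat, hid), nn.ReLU())
        self.topology = GraphConv(n_feat, hid, adj)
        self.emotion = nn.Linear(2 * hid * n_ch, n_cls)       # emotion classes
        self.domain = nn.Linear(2 * hid * n_ch, 2)            # source vs. target

    def forward(self, x, lam=1.0):                # x: (batch, channels, feats)
        h = torch.cat([self.activity(x), self.topology(x)], dim=-1).flatten(1)
        return self.emotion(h), self.domain(GradReverse.apply(h, lam))

# SEED-style toy input: batch of 8, 62 electrodes, 5 band-power features.
x = torch.randn(8, 62, 5)
emo_logits, dom_logits = DSPSketch()(x)
print(emo_logits.shape, dom_logits.shape)         # (8, 3) and (8, 2)
```

In a cross-subject setup, the emotion head would be trained on labeled source-subject data while the reversed domain head pushes the shared features toward subject invariance; the paper's actual adaptation mechanism may differ from this gradient-reversal stand-in.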
Edited by: Jiahui Pan, South China Normal University, China
Reviewed by: Dong Cui, Yanshan University, China; Man Fai Leung, Anglia Ruskin University, United Kingdom
ISSN: 1662-5161
DOI: 10.3389/fnhum.2024.1471634