Adaptive Spatial–Temporal Aware Graph Learning for EEG-Based Emotion Recognition

Bibliographic Details
Published in: Cyborg and Bionic Systems, Vol. 5, p. 0088
Main Authors: Ye, Weishan; Wang, Jiyuan; Chen, Lin; Dai, Lifei; Sun, Zhe; Liang, Zhen
Format: Journal Article
Language: English
Published: United States: American Association for the Advancement of Science (AAAS), 01.01.2024
ISSN: 2692-7632, 2097-1087
DOI: 10.34133/cbsystems.0088

Summary: An intelligent emotion recognition system based on electroencephalography (EEG) signals shows considerable potential in domains such as healthcare, entertainment, and education, thanks to its portability, high temporal resolution, and real-time capabilities. However, existing research in this field faces limitations stemming from the nonstationary nature and individual variability of EEG signals. In this study, we present a novel EEG emotion recognition model, named GraphEmotionNet, designed to enhance the accuracy of EEG-based emotion recognition through the incorporation of a spatiotemporal attention mechanism and transfer learning. The proposed GraphEmotionNet model can effectively learn the intrinsic connections between EEG channels and construct an adaptive graph. This graph's adaptive nature is crucial in optimizing spatial–temporal graph convolutions, which in turn enhances spatial–temporal feature characterization and contributes to the process of emotion classification. Moreover, the integration of domain adaptation aligns the extracted features across different domains, further alleviating the impact of individual EEG variability. We evaluate the model's performance on two benchmark databases, employing two types of cross-validation protocols: within-subject cross-validation and cross-subject cross-validation. The experimental results affirm the model's efficacy in extracting EEG features linked to emotional semantics and demonstrate its promising performance in emotion recognition.
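To illustrate the adaptive-graph idea described in the summary, the sketch below shows a minimal NumPy version of one spatial graph-convolution step over a learnable channel adjacency. This is not the authors' implementation: the channel count, feature dimension, weight shapes, and the softmax row-normalization are all assumptions chosen for clarity; in GraphEmotionNet the adjacency parameters would be trained by backpropagation alongside the rest of the network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 62 EEG channels, 5 band-power features per
# channel, 16 hidden units (all illustrative, not from the paper).
n_channels, n_features, n_hidden = 62, 5, 16

# Adaptive adjacency: a freely learnable parameter matrix over channel
# pairs. During training it would be updated by gradient descent so the
# model discovers intrinsic inter-channel connections; here it is random.
adj_param = rng.standard_normal((n_channels, n_channels))

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_graph_conv(x, adj_param, weight):
    """One spatial graph-convolution step: row-normalize the learned
    adjacency, aggregate each channel's neighbors, project, apply ReLU."""
    adj = softmax(adj_param, axis=1)          # (C, C) learned graph
    return np.maximum(adj @ x @ weight, 0.0)  # (C, hidden)

x = rng.standard_normal((n_channels, n_features))        # one EEG sample
weight = rng.standard_normal((n_features, n_hidden)) * 0.1
h = adaptive_graph_conv(x, adj_param, weight)
print(h.shape)  # (62, 16)
```

Stacking such a layer with a temporal convolution over successive EEG windows would give the spatial–temporal feature extractor the summary refers to; the domain-adaptation component would then align these features between source and target subjects.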