A Novel Conditional Adversarial Domain Adaptation Network for EEG Cross-subject Emotion Recognition

Bibliographic Details
Published in: IEEE Transactions on Affective Computing, pp. 1-13
Main Authors: Huang, He; Si, Xiaopeng; Han, Yumeng; Ming, Dong
Format: Journal Article
Language: English
Published: IEEE, 2025

Summary: Cross-subject emotion recognition based on electroencephalography (EEG) is a major development direction for affective brain-computer interfaces (aBCI). Researchers currently focus on domain adversarial neural networks (DANN) to capture domain-invariant features and improve the cross-subject generalization of models. However, existing DANNs in the aBCI field do not align features by directly estimating the discrepancy between the source and target domains, and may therefore struggle to effectively align the feature distributions of different domains. Moreover, the mainstream cross-subject evaluation protocols can yield inflated offline performance. To address these shortcomings of DANN, we develop a novel conditional adversarial domain adaptation network, which improves model performance by roughly 10%. Specifically, an introduced domain adapter gradually aligns the distributions of different domains during training to reduce domain differences, and a conditioning strategy in the domain discriminator aligns the distributions of different domains more effectively. We also develop a novel evaluation method that simulates an online scenario to address the issue of inflated offline performance. Extensive comparisons with existing methods demonstrate that the proposed approach achieves state-of-the-art cross-subject emotion recognition performance, attaining 93.62% accuracy on the SEED dataset and 82.16% on SEED-IV.
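
To make the conditional adversarial idea concrete, the sketch below shows a minimal conditional adversarial domain adaptation model in PyTorch: a feature extractor, an emotion classifier, and a domain discriminator trained through a gradient reversal layer, with the discriminator conditioned on the outer product of features and class predictions (CDAN-style conditioning in the spirit of what the summary describes). This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the layer sizes, the 310-dimensional differential-entropy input, the three SEED emotion classes, and the training step are assumptions, and the paper's gradual domain adapter and online-style evaluation protocol are not reproduced here.

```python
# Hypothetical sketch of a conditional adversarial domain adaptation network
# (CDAN/DANN-style), NOT the authors' exact architecture from the paper.
import torch
import torch.nn as nn
from torch.autograd import Function


class GradientReversal(Function):
    """Identity in the forward pass; multiplies the gradient by -lambda backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class ConditionalAdversarialNet(nn.Module):
    def __init__(self, n_features=310, n_classes=3, hidden=128):
        super().__init__()
        # Feature extractor (assumed MLP over per-trial EEG features).
        self.extractor = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Emotion classifier head.
        self.classifier = nn.Linear(hidden, n_classes)
        # Domain discriminator over the feature/prediction outer product,
        # i.e. the conditioning strategy on the discriminator input.
        self.discriminator = nn.Sequential(
            nn.Linear(hidden * n_classes, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, lambd=1.0):
        f = self.extractor(x)                      # domain-invariant features (goal)
        logits = self.classifier(f)                # emotion predictions
        g = torch.softmax(logits, dim=1).detach()  # class probabilities as condition
        # Multilinear conditioning: outer product of predictions and features.
        cond = torch.bmm(g.unsqueeze(2), f.unsqueeze(1)).flatten(1)
        d_logit = self.discriminator(GradientReversal.apply(cond, lambd))
        return logits, d_logit


def training_step(model, opt, x_src, y_src, x_tgt, lambd):
    """One adversarial step: source batch is labeled, target batch is not."""
    cls_loss_fn = nn.CrossEntropyLoss()
    dom_loss_fn = nn.BCEWithLogitsLoss()
    logits_s, d_s = model(x_src, lambd)
    _, d_t = model(x_tgt, lambd)
    # Domain labels: 1 for source, 0 for target; the reversal layer makes the
    # extractor fool the discriminator while the discriminator learns to separate.
    dom_loss = dom_loss_fn(d_s, torch.ones_like(d_s)) + \
               dom_loss_fn(d_t, torch.zeros_like(d_t))
    loss = cls_loss_fn(logits_s, y_src) + dom_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

One design choice worth noting in such a sketch: the class probabilities used for conditioning are detached, so the adversarial gradient reaches the classifier only through the shared feature extractor rather than through the conditioning path; whether the published model does this is not stated in the abstract.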
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2025.3588873