Conditional probabilistic-based domain adaptation for cross-subject EEG-based emotion recognition

Bibliographic Details
Published in: Cognitive Neurodynamics, Vol. 19, No. 1, p. 84
Main Authors: Cheng, Shichao; Wang, Yifan; Mei, Jiawei; Lin, Guang; Zhang, Jianhai; Kong, Wanzeng
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands (Springer Nature B.V.), 01.12.2025

Summary: Electroencephalogram (EEG)-based emotion recognition has received increasing attention in affective computing. Due to the non-stationary and non-linear characteristics of EEG signals, EEG data exhibit significant individual differences. Previous studies have adopted domain adaptation strategies to minimize the distribution gap between individuals and have achieved reasonable results. However, because they ignore the influence of individual-dependent background signals on task-dependent emotional signals, most of these methods can only align the source-domain and target-domain data as a whole, which may lead to confusion between categories. To address this limitation, this paper proposes a conditional probabilistic-based domain adversarial network (CPDAN) for cross-subject EEG-based emotion recognition. Exploiting the characteristics of cross-subject EEG signals, CPDAN uses separate branch networks to disentangle background features and task features from the EEG signals. In addition, CPDAN uses domain-adversarial training to model the discrepancy in both the global domain and the local domain, reducing the intra-class distance and enlarging the inter-class distance. Extensive experiments on SEED and SEED-IV demonstrate that the proposed CPDAN framework outperforms the comparison methods; on SEED-IV in particular, CPDAN improves average accuracy by 22% over the comparison methods.
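The domain-adversarial training mentioned in the abstract typically builds on a gradient reversal layer (GRL), as introduced by DANN-style methods: the layer is the identity in the forward pass, but flips (and scales) the gradient in the backward pass, so the feature extractor learns to confuse the domain discriminator. CPDAN's actual architecture (branch networks, global/local conditional alignment) is not detailed in this record; the following minimal numpy sketch, with hypothetical names and values, only illustrates the GRL mechanism itself.

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; multiplies gradients by -lam in backward.

    Sketch of the standard gradient reversal trick used in
    domain-adversarial training (not CPDAN's full alignment scheme).
    """

    def __init__(self, lam=1.0):
        self.lam = lam  # trade-off weight for the adversarial signal

    def forward(self, x):
        # Forward pass leaves the features untouched.
        return x

    def backward(self, grad_out):
        # Backward pass reverses the sign, so minimizing the domain loss
        # downstream maximizes domain confusion upstream.
        return -self.lam * grad_out

# Tiny demo with made-up feature and gradient values:
grl = GradientReversal(lam=0.5)
features = np.array([0.2, -0.7, 1.3])
out = grl.forward(features)                      # identical to features
grad_from_domain_head = np.array([1.0, 1.0, -2.0])
grad_to_features = grl.backward(grad_from_domain_head)  # sign-flipped, scaled
```

In a full pipeline this layer would sit between the shared feature extractor and the domain discriminator, while the emotion classifier receives the features directly.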
ISSN: 1871-4080, 1871-4099
DOI: 10.1007/s11571-025-10272-8