MTADA: A Multi-task Adversarial Domain Adaptation Network for EEG-based Cross-subject Emotion Recognition

Bibliographic Details
Published in: IEEE Transactions on Affective Computing, pp. 1-15
Main Authors: Qiu, Lina; Ying, Zuorui; Song, Xianyue; Feng, Weisen; Zhou, Chengju; Pan, Jiahui
Format: Journal Article
Language: English
Published: IEEE, 2025
Summary: In electroencephalogram (EEG)-based emotion recognition, the applicability of most current models is limited by inter-subject variability and the complexity of emotions. This study proposes a multi-task adversarial domain adaptation (MTADA) network to enhance cross-subject emotion recognition performance. The model first employs a domain matching strategy to select the source domain that best matches the target domain. Adversarial domain adaptation is then used to learn the differences between the source and target domains, and a fine-grained joint domain discriminator that incorporates category information is constructed to align them. At the same time, a multi-task learning mechanism is used to learn the intrinsic relationships between different emotions and to predict multiple emotions simultaneously. We conducted comprehensive experiments on two public datasets, DEAP and FACED. On DEAP, the average accuracies for valence, arousal, and dominance are 76.39%, 69.74%, and 68.26%, respectively. On FACED, the average accuracies for valence and arousal are 78.90% and 77.95%. When using subjects from DEAP as the source domain to predict subjects in FACED, the accuracies for valence and arousal are 61.07% and 60.82%. These results show that our MTADA model improves cross-subject emotion recognition and outperforms most state-of-the-art methods, which may provide a new approach for EEG-based emotion brain-computer interface systems.
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2025.3595137
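
The summary describes three components: source-domain matching, category-aware adversarial alignment, and multi-task emotion heads. Below is a minimal PyTorch sketch of how such a network could be wired, assuming a gradient-reversal layer for the adversarial objective. All layer sizes, the input feature dimensionality, the class names, and the conditioning scheme of the joint discriminator are illustrative assumptions, not the authors' exact design, which this record does not specify.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; flips and scales gradients on backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class MTADASketch(nn.Module):
    """Hypothetical MTADA-style network: a shared feature extractor, one
    classification head per emotion dimension, and a joint domain
    discriminator that also sees the concatenated class probabilities."""

    def __init__(self, in_dim=128, feat_dim=64, n_tasks=3, n_classes=2):
        # in_dim is a placeholder for the EEG feature dimensionality.
        super().__init__()
        self.extractor = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim), nn.ReLU(),
        )
        # One head per task, e.g. valence / arousal / dominance on DEAP.
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, n_classes) for _ in range(n_tasks)]
        )
        self.discriminator = nn.Sequential(
            nn.Linear(feat_dim + n_tasks * n_classes, 64), nn.ReLU(),
            nn.Linear(64, 2),  # source vs. target
        )

    def forward(self, x, lambd=1.0):
        feat = self.extractor(x)
        task_logits = [head(feat) for head in self.heads]
        # Condition the discriminator on predicted class probabilities so
        # alignment is fine-grained (category-aware), not just marginal.
        probs = torch.cat([l.softmax(dim=1) for l in task_logits], dim=1)
        joint = torch.cat([feat, probs.detach()], dim=1)
        domain_logits = self.discriminator(GradReverse.apply(joint, lambd))
        return task_logits, domain_logits
```

Training such a sketch would combine per-task cross-entropy on labeled source batches with a source-vs-target cross-entropy on the domain logits, typically ramping lambd from 0 to 1. The domain matching step could be approximated by choosing the source subject whose feature distribution is closest to the target's (e.g., by mean feature distance), though the record does not state the authors' matching criterion.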