Adversarial Domain Adaptation-Based EEG Emotion Transfer Recognition

Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 32706-32723
Main Authors: Li, Ting; Wang, Zhan; Liu, Huijing
Format: Journal Article
Language: English
Published: IEEE, 2025
Summary: This paper introduces an Emotion Domain Adversarial Neural Network (EDANN) model for electroencephalogram (EEG) emotion recognition, designed to classify EEG emotions across different recording sessions and subjects. The model comprises three essential components: an encoder, a label classifier, and a domain discriminator. Through adversarial training, EDANN extracts features that are discriminative across emotion categories yet invariant across domains. The study employed two public datasets: the DEAP dataset, which includes EEG signals from 32 participants for emotion analysis, and the SEED dataset, comprising emotional responses from 15 Chinese participants to 15 Chinese film clips. Both datasets use specific emotion models for labeling, albeit with varying levels of precision. Extensive experiments on both session-to-session and subject-to-subject transfer tasks demonstrated the proposed model's superior accuracy, precision, recall, and F1 score for emotion recognition. The findings not only illustrate the efficacy of EDANN in EEG-based emotion recognition but also underscore the importance of accounting for substantial inter-individual differences when designing and evaluating machine learning models. Such methods allow researchers to use limited data resources more efficiently, advancing emotion recognition technology.
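The summary describes a domain-adversarial setup (encoder, label classifier, domain discriminator). The standard mechanism behind such models is a gradient reversal layer (GRL): it acts as the identity on the forward pass but negates the gradient flowing from the domain discriminator back into the shared encoder, so the encoder is pushed toward domain-invariant features. The NumPy sketch below is not the paper's EDANN code; the weight names (`W_enc`, `W_lab`, `W_dom`), the squared-error surrogate losses, and the coefficient `lam` are illustrative assumptions used only to show the sign flip.

```python
import numpy as np

def grl_backward(grad, lam=1.0):
    """Gradient Reversal Layer: identity on forward, -lam * grad on backward."""
    return -lam * grad

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))      # 8 samples, 4 toy "EEG features"
W_enc = rng.normal(size=(4, 3))  # shared encoder weights
W_lab = rng.normal(size=(3, 2))  # label-classifier head
W_dom = rng.normal(size=(3, 2))  # domain-discriminator head

H = X @ W_enc                    # shared feature representation

# Gradients w.r.t. H of the surrogate losses 0.5*||H W||^2 for each head.
g_lab = (H @ W_lab) @ W_lab.T
g_dom = (H @ W_dom) @ W_dom.T

# Encoder gradient: label gradient passes through unchanged, while the
# domain gradient is reversed by the GRL before reaching the encoder.
g_enc = X.T @ (g_lab + grl_backward(g_dom, lam=0.5))
```

With the reversal, minimizing the combined objective trains the discriminator to separate domains while simultaneously training the encoder to confuse it, which is the adversarial dynamic the summary attributes to EDANN.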
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3540436