Generator-based Domain Adaptation Method with Knowledge Free for Cross-subject EEG Emotion Recognition

Bibliographic Details
Published in: Cognitive Computation, Vol. 14, No. 4, pp. 1316-1327
Main Authors: Huang, Dongmin; Zhou, Sijin; Jiang, Dazhi
Format: Journal Article
Language: English
Published: New York: Springer US, 01.07.2022 (Springer Nature B.V.)
Summary: Most existing approaches for cross-subject electroencephalogram (EEG) emotion recognition learn universal features across subjects based on neurological findings. The performance of these methods may be sub-optimal because the relationships between the brain and emotion have not been adequately investigated. Hence, when neurological findings are insufficient, it is essential to develop a domain adaptation method for EEG data. In this paper, we propose a generator-based domain adaptation method with knowledge free (GDAKF) mechanism for cross-subject EEG emotion recognition. Specifically, the feature distribution of the source domain is transformed into the feature distribution of the target domain via adversarial learning between a generator and a discriminator. Additionally, the transformation is constrained by an EEG content regression loss and an emotion information loss so that emotional information is preserved during feature alignment. To evaluate the effectiveness of GDAKF, extensive experiments are carried out on the benchmark DEAP dataset. The results show that GDAKF achieves a mean accuracy of 63.85% on low/high valence classification, which is comparable to existing cross-subject EEG emotion recognition methods in the literature. This paper provides a novel approach to cross-subject EEG emotion recognition, and it can also be applied to cross-session and cross-device emotion recognition tasks.
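
To make the described training scheme concrete, the following is a minimal, hypothetical sketch of one GDAKF-style training step in PyTorch. It is not the authors' implementation: the network sizes, feature dimension, loss weights, and the exact form of the content regression loss (here, regression of the generated features back to the source features) are assumptions; only the overall structure (generator vs. discriminator adversarial learning, plus content and emotion constraints) follows the summary above.

```python
# Hypothetical GDAKF-style training step (all names, dimensions and loss forms are
# assumptions, not taken from the paper). A generator maps source-subject EEG
# features toward the target-subject feature distribution; a discriminator gives
# the adversarial signal; auxiliary content-regression and emotion-classification
# losses keep the emotional information intact during alignment.
import torch
import torch.nn as nn

FEAT_DIM, N_CLASSES = 160, 2  # e.g. 32 channels x 5 bands of DE features; low/high valence

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(FEAT_DIM, 128), nn.ReLU(), nn.Linear(128, FEAT_DIM))
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
emotion_clf = nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, N_CLASSES))
opt_g = torch.optim.Adam(list(G.parameters()) + list(emotion_clf.parameters()), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce, mse, ce = nn.BCEWithLogitsLoss(), nn.MSELoss(), nn.CrossEntropyLoss()

def train_step(x_src, y_src, x_tgt, lambda_content=1.0, lambda_emotion=1.0):
    # 1) Discriminator: distinguish real target features from generated ones.
    fake_tgt = G(x_src).detach()
    d_loss = bce(D(x_tgt), torch.ones(x_tgt.size(0), 1)) + \
             bce(D(fake_tgt), torch.zeros(fake_tgt.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator: fool the discriminator while keeping the EEG content
    #    (regression constraint) and the source emotion label recoverable.
    fake_tgt = G(x_src)
    adv_loss = bce(D(fake_tgt), torch.ones(fake_tgt.size(0), 1))
    content_loss = mse(fake_tgt, x_src)               # assumed EEG content regression loss
    emotion_loss = ce(emotion_clf(fake_tgt), y_src)   # assumed emotion information loss
    g_loss = adv_loss + lambda_content * content_loss + lambda_emotion * emotion_loss
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage with random tensors standing in for preprocessed DEAP features.
x_src, y_src = torch.randn(32, FEAT_DIM), torch.randint(0, N_CLASSES, (32,))
x_tgt = torch.randn(32, FEAT_DIM)
print(train_step(x_src, y_src, x_tgt))
```

The design choice reflected here is that the generator, rather than a shared feature extractor, carries the burden of domain alignment, so the downstream emotion classifier can be trained on labeled source data and applied to target-like features without target labels.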
ISSN: 1866-9956; 1866-9964
DOI: 10.1007/s12559-022-10016-4