Multi-source joint domain adaptation for cross-subject and cross-session emotion recognition from electroencephalography
Published in | Frontiers in Human Neuroscience, Vol. 16, p. 921346 |
---|---|
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Lausanne: Frontiers Research Foundation / Frontiers Media S.A., 15.09.2022 |
Summary: | As an important component in the development of affective brain–computer interfaces, emotion recognition based on electroencephalography (EEG) faces a difficult challenge: the distribution of EEG data shifts across subjects and across recording sessions. Domain adaptation methods can effectively alleviate the generalization problem of EEG emotion recognition models. However, most of them treat multiple source domains with significantly different distributions as a single source domain, and adapt only the cross-domain marginal distribution while ignoring the joint distribution difference between domains. To exploit multiple source distributions and better match the source and target domains, this paper proposes a novel multi-source joint domain adaptation (MSJDA) network. We first map all domains to a shared feature space and then, for each pair of source and target domains, align the joint distributions of the further extracted private representations and the corresponding classification predictions. Extensive cross-subject and cross-session experiments on the benchmark dataset SEED demonstrate the effectiveness of the proposed model, with the larger gains obtained on the more difficult cross-subject emotion recognition task. |
---|---|
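The per-pair alignment described in the summary can be sketched in miniature. The outline below is illustrative only, not the paper's implementation: the linear-kernel MMD criterion and the names `mmd_linear` and `msjda_alignment_loss` are assumptions, and the actual model applies such a loss to learned private deep representations and softmax predictions rather than raw vectors.

```python
def mmd_linear(xs, ys):
    """Squared MMD with a linear kernel: ||mean(xs) - mean(ys)||^2.

    xs, ys: lists of equal-dimension feature vectors (lists of floats).
    """
    d = len(xs[0])
    mean_x = [sum(x[i] for x in xs) / len(xs) for i in range(d)]
    mean_y = [sum(y[i] for y in ys) / len(ys) for i in range(d)]
    return sum((a - b) ** 2 for a, b in zip(mean_x, mean_y))


def msjda_alignment_loss(source_feats, source_preds, target_feats, target_preds):
    """Joint-distribution alignment in the spirit of MSJDA (illustrative).

    For each source domain, align both the private feature representations
    and the classifier predictions with the target domain, then average
    over all source domains.
    """
    total = 0.0
    for feats, preds in zip(source_feats, source_preds):
        total += mmd_linear(feats, target_feats)  # feature-distribution term
        total += mmd_linear(preds, target_preds)  # prediction-distribution term
    return total / len(source_feats)
```

When a source domain matches the target exactly, both terms vanish and the loss is zero; a shifted source contributes a positive penalty, which is the signal a training loop would minimize alongside the classification loss.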
Bibliography: | Edited by: Yasar Ayaz, National University of Sciences and Technology (NUST), Pakistan. Reviewed by: Mengfan Li, Hebei University of Technology, China; Sidath R. Liyanage, University of Kelaniya, Sri Lanka. This article was submitted to Brain-Computer Interfaces, a section of the journal Frontiers in Human Neuroscience. |
ISSN: | 1662-5161 |
DOI: | 10.3389/fnhum.2022.921346 |