MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition

Bibliographic Details
Published in Frontiers in Neuroscience, Vol. 15, p. 778488
Main Authors Chen, Hao, Jin, Ming, Li, Zhunan, Fan, Cunhang, Li, Jinpeng, He, Huiguang
Format Journal Article
Language English
Published Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 07.12.2021
Abstract As an essential element in the diagnosis and rehabilitation of psychiatric disorders, electroencephalogram (EEG)-based emotion recognition has achieved significant progress due to its high precision and reliability. However, one obstacle to practicality lies in the variability between subjects and sessions. Although several studies have adopted domain adaptation (DA) approaches to tackle this problem, most of them treat multiple EEG data from different subjects and sessions together as a single source domain for transfer, which either fails to satisfy the assumption of domain adaptation that the source has a certain marginal distribution, or increases the difficulty of adaptation. We therefore propose multi-source marginal distribution adaptation (MS-MDA) for EEG emotion recognition, which takes both domain-invariant and domain-specific features into consideration. First, we assume that different EEG data share the same low-level features; then we construct independent branches for the multiple EEG source domains to perform one-to-one domain adaptation and extract domain-specific features. Finally, inference is made jointly by the multiple branches. We evaluate our method on SEED and SEED-IV for recognizing three and four emotions, respectively. Experimental results show that MS-MDA outperforms the comparison methods and state-of-the-art models in cross-session and cross-subject transfer scenarios in our settings. Code is available at https://github.com/VoiceBeer/MS-MDA.
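The per-branch marginal-distribution alignment that the abstract describes is commonly implemented by minimizing a maximum mean discrepancy (MMD) between source-branch and target features. As a hedged illustration (not the authors' implementation; the Gaussian kernel, the `sigma` value, and the toy samples below are assumptions for the sketch), squared MMD between two sample sets can be computed as:

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def mmd2(xs, ys, sigma=1.0):
    """Squared maximum mean discrepancy between sample sets xs and ys.

    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], estimated with
    empirical means over all pairs. Near zero when the two sets are
    drawn from the same distribution; larger as they diverge.
    """
    kxx = sum(gaussian_kernel(a, b, sigma) for a in xs for b in xs) / (len(xs) ** 2)
    kyy = sum(gaussian_kernel(a, b, sigma) for a in ys for b in ys) / (len(ys) ** 2)
    kxy = sum(gaussian_kernel(a, b, sigma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2 * kxy

# Toy check: identical sets give ~0, shifted sets give a clearly larger value.
src = [[0.0, 0.0], [1.0, 1.0]]
tgt = [[5.0, 5.0], [6.0, 6.0]]
same = mmd2(src, src)
diff = mmd2(src, tgt)
```

In an MS-MDA-style setup, each source branch would minimize such an MMD term between its own source features and the shared target features, alongside its classification loss, and the branch predictions would then be combined at inference.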
AuthorAffiliation 1 HwaMei Hospital, University of Chinese Academy, Ningbo, China
2 Center for Pattern Recognition and Intelligent Medicine, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo, China
3 Anhui Province Key Laboratory of Multimodal Cognitive Computation, School of Computer Science and Technology, Anhui University, Hefei, China
4 Research Center for Brain-inspired Intelligence and National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
Copyright Copyright © 2021 Chen, Jin, Li, Fan, Li and He.
2021. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
DOI 10.3389/fnins.2021.778488
Discipline Anatomy & Physiology
EISSN 1662-453X
ExternalDocumentID oai_doaj_org_article_b46a630373ed459ca023cebb5781de08
PMC8688841
34949983
10_3389_fnins_2021_778488
Genre Journal Article
ISSN 1662-453X
1662-4548
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords transfer learning
domain adaptation
emotion recognition
EEG
brain-computer interface
Language English
License Copyright © 2021 Chen, Jin, Li, Fan, Li and He.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Notes
This article was submitted to Brain Imaging Methods, a section of the journal Frontiers in Neuroscience
Reviewed by: Huan Liu, Jiangsu University, China; Linling Li, Shenzhen University, China
Edited by: Yudan Ren, Northwest University, China
OpenAccessLink http://journals.scholarsportal.info/openUrl.xqy?doi=10.3389/fnins.2021.778488
PMID 34949983
References_xml – start-page: 4409
  volume-title: 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC)
  year: 2019
  ident: B24
  article-title: Towards an eeg-based intuitive bci communication system using imagined speech and visual imagery
  doi: 10.1109/SMC.2019.8914645
– volume-title: Proceedings of the 35th AAAI Conference on Artificial Intelligence
  year: 2021
  ident: B47
  article-title: Plug-and-play domain adaptation for cross-subject eeg-based emotion recognition
  doi: 10.1609/aaai.v35i1.16169
– volume: 29
  start-page: 215
  year: 2020
  ident: B46
  article-title: Brain functional networks based on resting-state eeg data for major depressive disorder analysis and classification
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng
  doi: 10.1109/TNSRE.2020.3043426
– start-page: 81
  volume-title: 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER)
  year: 2013
  ident: B11
  article-title: Differential entropy feature for eeg-based emotion classification
  doi: 10.1109/NER.2013.6695876
– volume: 3
  start-page: 18
  year: 2011
  ident: B23
  article-title: Deap: a database for emotion analysis; using physiological signals
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/T-AFFC.2011.15
– volume: 16
  start-page: 026007
  year: 2019
  ident: B13
  article-title: Inter-subject transfer learning with an end-to-end deep convolutional neural network for eeg-based bci
  publication-title: J. Neural Eng
  doi: 10.1088/1741-2552/aaf3f6
– start-page: 5989
  volume-title: Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 33
  year: 2019
  ident: B51
  article-title: Aligning domain-specific distribution and classifier for cross-domain classification from multiple sources
– start-page: 2618
  volume-title: 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
  year: 2020
  ident: B26
  article-title: Foit: fast online instance transfer for improved eeg emotion recognition
  doi: 10.1109/BIBM49941.2020.9312984
– start-page: 403
  volume-title: International Conference on Neural Information Processing
  year: 2018
  ident: B25
  article-title: Cross-subject emotion recognition using deep adaptation networks
  doi: 10.1007/978-3-030-04221-9_36
– volume: 8
  start-page: 1454
  year: 2017
  ident: B40
  article-title: The influences of emotion on learning and memory
  publication-title: Front. Psychol
  doi: 10.3389/fpsyg.2017.01454
– volume: 298
  start-page: 1191
  year: 2002
  ident: B10
  article-title: Emotion, cognition, and behavior
  publication-title: Science
  doi: 10.1126/science.1076358
– start-page: 2732
  volume-title: Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence
  year: 2016
  ident: B50
  article-title: Personalizing eeg-based affective models with transfer learning
– year: 2015
  ident: B44
  article-title: Empirical evaluation of rectified activations in convolutional network
  publication-title: arXiv preprint
– year: 2014
  ident: B22
  article-title: Adam: a method for stochastic optimization
  publication-title: arXiv preprint
– start-page: 443
  volume-title: European Conference on Computer Vision
  year: 2016
  ident: B38
  article-title: Deep coral: correlation alignment for deep domain adaptation
– start-page: 97
  volume-title: International Conference on Machine Learning
  year: 2015
  ident: B31
  article-title: Learning transferable features with deep adaptation networks
– volume: 408
  start-page: 100
  year: 2020
  ident: B19
  article-title: Driver sleepiness detection from eeg and eog signals using gan and lstm networks
  publication-title: Neurocomputing
  doi: 10.1016/j.neucom.2019.05.108
– volume: 7
  start-page: 17
  year: 2015
  ident: B37
  article-title: Analysis of eeg signals and facial expressions for continuous emotion detection
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/TAFFC.2015.2436926
– volume: 24
  start-page: 281
  year: 2010
  ident: B21
  article-title: Emotion regulation in depression: relation to cognitive inhibition
  publication-title: Cogn. Emot
  doi: 10.1080/02699930903407948
– start-page: 176
  volume-title: Proceedings of the 27th ACM International Conference on Multimedia
  year: 2019
  ident: B32
  article-title: Emotion recognition using multimodal residual lstm network
  doi: 10.1145/3343031.3350871
– volume: 22
  start-page: 199
  year: 2010
  ident: B34
  article-title: Domain adaptation via transfer component analysis
  publication-title: IEEE Trans. Neural Netw
  doi: 10.1109/TNN.2010.2091281
– volume: 7
  start-page: 162
  year: 2015
  ident: B49
  article-title: Investigating critical frequency bands and channels for eeg-based emotion recognition with deep neural networks
  publication-title: IEEE Trans. Auton. Ment. Dev
  doi: 10.1109/TAMD.2015.2431497
– volume: 22
  start-page: 1345
  year: 2009
  ident: B35
  article-title: A survey on transfer learning
  publication-title: IEEE Trans Knowl Data Eng
  doi: 10.1109/TKDE.2009.191
– volume: 9
  start-page: 2579
  year: 2008
  ident: B42
  article-title: Visualizing data using t-sne
  publication-title: J. Mach. Learn. Res
– volume: 49
  start-page: 1110
  year: 2018
  ident: B48
  article-title: Emotionmeter: a multimodal framework for recognizing human emotions
  publication-title: IEEE Trans. Cybern
  doi: 10.1109/TCYB.2018.2797176
– volume: 17
  start-page: 1014
  year: 2017
  ident: B8
  article-title: A fast, efficient domain adaptation technique for cross-domain electroencephalography (eeg)-based emotion recognition
  publication-title: Sensors
  doi: 10.3390/s17051014
– volume: 4
  start-page: 259
  year: 2016
  ident: B45
  article-title: Abcnn: attention-based convolutional neural network for modeling sentence pairs
  publication-title: Trans. Assoc. Comput. Linguist
  doi: 10.1162/tacl_a_00097
– volume: 79
  start-page: 205
  year: 2016
  ident: B9
  article-title: Unsupervised domain adaptation techniques based on auto-encoder for non-stationary eeg-based emotion recognition
  publication-title: Comput. Biol. Med
  doi: 10.1016/j.compbiomed.2016.10.019
– volume: 110
  start-page: 107626
  year: 2021
  ident: B43
  article-title: A prototype-based spd matrix network for domain adaptation eeg emotion recognition
  publication-title: Pattern Recognit
  doi: 10.1016/j.patcog.2020.107626
– volume: 29
  start-page: 566
  year: 2021
  ident: B18
  article-title: Enhancing eeg-based classification of depression patients using spatial information
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng
  doi: 10.1109/TNSRE.2021.3059429
– volume: 24
  start-page: 169
  year: 2015
  ident: B30
  article-title: A boosting-based spatial-spectral model for stroke patients' eeg analysis in rehabilitation training
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng
  doi: 10.1109/TNSRE.2015.2466079
– volume: 22
  start-page: e49
  year: 2006
  ident: B5
  article-title: Integrating structured biological data by kernel maximum mean discrepancy
  publication-title: Bioinformatics
  doi: 10.1093/bioinformatics/btl242
– volume: 43
  start-page: 1
  year: 2019
  ident: B2
  article-title: Automated depression detection using deep representation and sequence learning with eeg signals
  publication-title: J. Med. Syst
  doi: 10.1007/s10916-019-1345-y
– volume: 12
  start-page: 344
  ident: B27
  article-title: Domain adaptation for eeg emotion recognition based on latent representation similarity
  publication-title: IEEE Trans. Cogn. Dev. Syst
  doi: 10.1109/TCDS.2019.2949306
– volume: 43
  start-page: 517
  year: 2006
  ident: B4
  article-title: Breaking the silence: brain-computer interfaces (bci) for communication and motor control
  publication-title: Psychophysiology
  doi: 10.1111/j.1469-8986.2006.00456.x
– year: 2014
  ident: B41
  article-title: Deep domain confusion: maximizing for domain invariance
  publication-title: arXiv preprint
– volume-title: EEG Signal Processing
  year: 2013
  ident: B36
– volume: 15
  start-page: 713
  year: 2001
  ident: B3
  article-title: Knowing what you're feeling and knowing what to do about it: mapping the relation between emotion differentiation and emotion regulation
  publication-title: Cogn. Emot
  doi: 10.1080/02699930143000239
– volume: 109
  start-page: 339
  year: 2013
  ident: B16
  article-title: Classifying depression patients and normal subjects using machine learning techniques and nonlinear features from eeg signal
  publication-title: Comput. Methods Programs Biomed
  doi: 10.1016/j.cmpb.2012.10.008
– volume: 8
  start-page: 222
  year: 2004
  ident: B6
  article-title: Emotion processing in alzheimer's disease
  publication-title: Aging Mental Health
  doi: 10.1080/13607860410001669750
– volume: 50
  start-page: 3281
  ident: B28
  article-title: Multisource transfer learning for cross-subject eeg emotion recognition
  publication-title: IEEE Trans. Cybern
  doi: 10.1109/TCYB.2019.2904052
– volume: 58
  start-page: 1
  year: 2011
  ident: B7
  article-title: Robust principal component analysis?
  publication-title: J. ACM
  doi: 10.1145/1970392.1970395
– volume: 48
  start-page: 384
  year: 1993
  ident: B12
  article-title: Facial expression and emotion
  publication-title: Am. Psychol
  doi: 10.1037/0003-066X.48.4.384
– volume: 11
  start-page: 64
  year: 2017
  ident: B29
  article-title: Improving cross-day EEG-based emotion classification using robust principal component analysis
  publication-title: Front. Comput. Neurosci
  doi: 10.3389/fncom.2017.00064
– start-page: 222
  volume-title: 2017 International Conference on Orange Technologies (ICOT)
  year: 2017
  ident: B20
  article-title: EEG-based emotion recognition using domain adaptation network
  doi: 10.1109/ICOT.2017.8336126
– volume: 73
  start-page: 329
  year: 2015
  ident: B1
  article-title: Computer-aided diagnosis of depression using EEG signals
  publication-title: Eur. Neurol
  doi: 10.1159/000381950
– start-page: 1
  volume-title: 2020 International Joint Conference on Neural Networks (IJCNN)
  year: 2020
  ident: B39
  article-title: Emotion recognition under sleep deprivation using a multimodal residual LSTM network
– volume: 32
  start-page: 2627
  year: 1998
  ident: B15
  article-title: Artificial neural networks (the multilayer perceptron)—review of applications in the atmospheric sciences
  publication-title: Atmos. Environ
  doi: 10.1016/S1352-2310(97)00447-0
– volume-title: ICML
  year: 2010
  ident: B33
  article-title: Rectified linear units improve restricted Boltzmann machines
– volume: 42
  start-page: 1169
  year: 2012
  ident: B14
  article-title: A new gaze-BCI-driven control of an upper limb exoskeleton for rehabilitation in real-world tasks
  publication-title: IEEE Trans. Syst. Man Cybern. C
  doi: 10.1109/TSMCC.2012.2226444
– start-page: 3
  volume-title: Emotions and Affect in Human Factors and Human-Computer Interaction
  year: 2017
  ident: B17
  article-title: Emotions and affect in human factors and human-computer interaction: taxonomy, theories, approaches, and methods
  doi: 10.1016/B978-0-12-801851-4.00001-X
StartPage 778488
SubjectTerms Adaptation
Alzheimer's disease
Brain research
brain-computer interface
Classification
domain adaptation
EEG
Electroencephalography
emotion recognition
Emotional regulation
Emotions
Experiments
Machine learning
Mental disorders
Methods
Neuroscience
Rehabilitation
transfer learning
Title MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
URI https://www.ncbi.nlm.nih.gov/pubmed/34949983
https://www.proquest.com/docview/2607462512
https://www.proquest.com/docview/2614230590
https://pubmed.ncbi.nlm.nih.gov/PMC8688841
https://doaj.org/article/b46a630373ed459ca023cebb5781de08
Volume 15