CoAdapt: Collaborative Adaptation Between Latent EEG Feature Representation and Annotation for Emotion Decoding

Electroencephalogram (EEG) data contain rich neurophysiological information that can objectively express the emotional state of human beings. However, the inherent EEG characteristics such as nonstationarity and weakness, combined with the possible limited immersion and carry-over effect of subjects...

Bibliographic Details
Published in IEEE transactions on instrumentation and measurement, Vol. 74, pp. 1-16
Main Authors Gong, Xiaoxiao; Chen, Yuxin; Zhang, Pengfei; Peng, Yong; Fang, Jinglong; Cichocki, Andrzej
Format Journal Article
Language English
Published IEEE 2025
Subjects
Online Access Get full text
ISSN 0018-9456
EISSN 1557-9662
DOI 10.1109/TIM.2025.3590828

Abstract Electroencephalogram (EEG) data contain rich neurophysiological information that can objectively express the emotional state of human beings. However, inherent EEG characteristics such as nonstationarity and signal weakness, combined with subjects' possibly limited immersion and carry-over effects during data collection experiments, may cause the semantic meaning of an extracted EEG feature vector to mismatch its annotated emotional state, dubbed the "feature-label inconsistency" dilemma in EEG-based emotion decoding. To this end, this article proposes to alleviate the side effects of feature-label inconsistency from both the feature and the label aspects. On the one hand, we explore a more meaningful emotion-related EEG representation via latent low-rank representation (LRR). On the other hand, we enhance the correspondence between the explored EEG representation and its annotated emotional state via a label dragging strategy. As a result, a collaborative adaptation (CoAdapt) model between the latent EEG feature representation and its annotation is formed for efficient emotion decoding, implemented within a semi-supervised framework to better capture the properties of both labeled and unlabeled EEG data. Experimental results on three publicly available datasets, SEED-IV, SEED-V, and MPED, show that: 1) CoAdapt achieves better emotion recognition performance than related models; 2) improvements in interclass separability and label margin are empirically verified, indicating the effectiveness of the purified EEG feature representation and the rectified emotion annotation; and 3) several task-related findings are identified from a data-driven perspective, including the emotion carry-over effect and the discriminative spatial patterns in emotion decoding.
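The abstract's feature-purification step is based on latent low-rank representation (LRR). As context, the standard latent LRR objective (a generic formulation, not necessarily the exact variant used in CoAdapt) decomposes an observed feature matrix X into a principal low-rank coefficient part Z, a latent (salient-feature) part L, and a sparse error term E:

    \min_{Z, L, E} \; \|Z\|_{*} + \|L\|_{*} + \lambda \|E\|_{1} \quad \text{s.t.} \quad X = XZ + LX + E

Solvers for such nuclear-norm problems typically rely on singular value thresholding (SVT), the proximal operator of the nuclear norm. The minimal Python/NumPy sketch below illustrates only this generic building block; the function name svt, the threshold tau, and the toy matrices are illustrative choices rather than details taken from the paper.

    import numpy as np

    def svt(M, tau):
        # Singular value thresholding: proximal operator of tau * (nuclear norm).
        # Each singular value of M is reduced by tau and clipped at zero.
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s_shrunk = np.maximum(s - tau, 0.0)
        return (U * s_shrunk) @ Vt

    # Toy usage: recover approximately low-rank structure from a noisy matrix.
    rng = np.random.default_rng(0)
    X_clean = rng.standard_normal((64, 5)) @ rng.standard_normal((5, 32))  # rank 5
    X_noisy = X_clean + 0.1 * rng.standard_normal((64, 32))
    X_hat = svt(X_noisy, tau=2.0)
    kept = int(np.sum(np.linalg.svd(X_hat, compute_uv=False) > 1e-8))
    print(kept)  # far fewer nonzero singular values than the full rank of 32

In LRR-style algorithms this SVT step usually appears inside an alternating scheme (e.g., inexact ALM) that updates Z, L, and E in turn; the label dragging strategy and the semi-supervised coupling described above are specific to CoAdapt and are not sketched here.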
Author Peng, Yong
Gong, Xiaoxiao
Chen, Yuxin
Zhang, Pengfei
Cichocki, Andrzej
Fang, Jinglong
Author_xml – sequence: 1
  givenname: Xiaoxiao
  orcidid: 0009-0004-5306-6991
  surname: Gong
  fullname: Gong, Xiaoxiao
  email: bettygxx@hdu.edu.cn
  organization: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
– sequence: 2
  givenname: Yuxin
  orcidid: 0009-0002-5798-2239
  surname: Chen
  fullname: Chen, Yuxin
  email: 244050098@hdu.edu.cn
  organization: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
– sequence: 3
  givenname: Pengfei
  orcidid: 0009-0000-9322-7228
  surname: Zhang
  fullname: Zhang, Pengfei
  email: 22031122@hdu.edu.cn
  organization: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
– sequence: 4
  givenname: Yong
  orcidid: 0000-0003-1208-972X
  surname: Peng
  fullname: Peng, Yong
  email: yongpeng@hdu.edu.cn
  organization: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
– sequence: 5
  givenname: Jinglong
  orcidid: 0000-0002-3560-4926
  surname: Fang
  fullname: Fang, Jinglong
  email: fjl@hdu.edu.cn
  organization: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
– sequence: 6
  givenname: Andrzej
  orcidid: 0000-0002-8364-7226
  surname: Cichocki
  fullname: Cichocki, Andrzej
  email: cichockiand@gmail.com
  organization: Systems Research Institute of Polish Academy of Sciences, Warszawa, Poland
CODEN IEIMAO
ContentType Journal Article
DBID 97E
RIA
RIE
AAYXX
CITATION
DOI 10.1109/TIM.2025.3590828
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
DatabaseTitle CrossRef
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Physics
EISSN 1557-9662
EndPage 16
ExternalDocumentID 10_1109_TIM_2025_3590828
11087418
Genre orig-research
GrantInformation_xml – fundername: Ministry of Education of the People's Republic of China (MoE) Humanities and Social Sciences Project
  grantid: 24YJCZH225
  funderid: 10.13039/100013546
– fundername: General Natural Science Project of Fuyang Normal University
  grantid: 2021FSKJ1500
  funderid: 10.13039/501100012404
– fundername: National Key Research and Development Program of China
  grantid: 2023YFE0114900
  funderid: 10.13039/501100012166
– fundername: Key Laboratory of Embedded System and Services Computing of the Ministry of Education
  grantid: ESSCKF2024-11
IEDL.DBID RIE
ISSN 0018-9456
IngestDate Wed Aug 06 19:11:28 EDT 2025
Wed Aug 27 01:44:38 EDT 2025
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
ORCID 0009-0004-5306-6991
0009-0002-5798-2239
0000-0003-1208-972X
0000-0002-3560-4926
0000-0002-8364-7226
0009-0000-9322-7228
PageCount 16
ParticipantIDs crossref_primary_10_1109_TIM_2025_3590828
ieee_primary_11087418
PublicationCentury 2000
PublicationDate 20250000
2025-00-00
PublicationDateYYYYMMDD 2025-01-01
PublicationDate_xml – year: 2025
  text: 20250000
PublicationDecade 2020
PublicationTitle IEEE transactions on instrumentation and measurement
PublicationTitleAbbrev TIM
PublicationYear 2025
Publisher IEEE
Publisher_xml – name: IEEE
SSID ssj0007647
SourceID crossref
ieee
SourceType Index Database
Publisher
StartPage 1
SubjectTerms Annotations
Brain modeling
Collaborative adaptation (CoAdapt)
Data mining
Decoding
electroencephalogram (EEG)-based emotion recognition
Electroencephalography
Emotion recognition
Feature extraction
feature-label inconsistency
label dragging
latent low-rank representation (LRR)
Matrix decomposition
Noise
Sparse matrices
Title CoAdapt: Collaborative Adaptation Between Latent EEG Feature Representation and Annotation for Emotion Decoding
URI https://ieeexplore.ieee.org/document/11087418
Volume 74
hasFullText 1
inHoldings 1
isFullTextHit
isPrint