DAGAM: a domain adversarial graph attention model for subject-independent EEG-based emotion recognition

Published in Journal of neural engineering Vol. 20; no. 1; pp. 16022 - 16031
Main Authors Xu, Tao; Dang, Wang; Wang, Jiabao; Zhou, Yun
Format Journal Article
Language English
Published England IOP Publishing 01.02.2023
Abstract Objective. Due to individual differences in electroencephalogram (EEG) signals, the learning model built by the subject-dependent technique from one person’s data would be inaccurate when applied to another person for emotion recognition. Thus, the subject-dependent approach for emotion recognition may result in poor generalization performance when compared to the subject-independent approach. However, existing studies have attempted but have not fully utilized EEG’s topology, nor have they solved the problem caused by the difference in data distribution between the source and target domains. Approach. To eliminate individual differences in EEG signals, this paper proposes the domain adversarial graph attention model, a novel EEG-based emotion recognition model. The basic idea is to generate a graph using biological topology to model multichannel EEG signals. Graph theory can topologically describe and analyze EEG channel relationships and mutual dependencies. Then, unlike other graph convolutional networks, self-attention pooling is used to benefit from the extraction of salient EEG features from the graph, effectively improving performance. Finally, following graph pooling, the domain adversarial model based on the graph is used to identify and handle EEG variation across subjects, achieving good generalizability efficiently. Main Results. We conduct extensive evaluations on two benchmark data sets (SEED and SEED IV) and obtain cutting-edge results in subject-independent emotion recognition. Our model boosts the SEED accuracy to 92.59% (4.06% improvement) with the lowest standard deviation (STD) of 3.21% (2.46% decrements) and SEED IV accuracy to 80.74% (6.90% improvement) with the lowest STD of 4.14% (3.88% decrements), respectively. The computational complexity is drastically reduced in comparison to similar efforts (33 times lower). Significance. We have developed a model that significantly reduces the computation time while maintaining accuracy, making EEG-based emotion decoding more practical and generalizable.
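The Approach described above rests on three standard building blocks: a channel graph built from electrode adjacency, self-attention graph pooling that keeps only the most salient channels, and domain-adversarial training, whose usual mechanism is a gradient reversal layer. A minimal NumPy sketch of those pieces follows; the 3-channel montage, feature sizes, and λ value are illustrative assumptions, not taken from the paper (which works on the 62-channel SEED layout).

```python
import numpy as np

# Hypothetical 3-channel toy montage: adjacency is 1 (plus self-loop)
# when two electrodes are physically adjacent on the scalp.
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]], dtype=float)

deg = adj.sum(axis=1)
# Symmetrically normalized adjacency, the usual GCN normalization.
norm_adj = adj / np.sqrt(np.outer(deg, deg))

def graph_conv(x, w):
    """One graph convolution: aggregate neighbor features, then project."""
    return np.tanh(norm_adj @ x @ w)

def self_attention_pool(x, a, keep=2):
    """SAGPool-style pooling: score nodes via a graph conv, keep the top-k."""
    scores = (norm_adj @ x @ a).ravel()   # one attention score per channel
    top = np.argsort(scores)[-keep:]      # indices of the most salient channels
    return x[top] * np.tanh(scores[top])[:, None]

def grad_reverse_backward(grad, lam=1.0):
    """Gradient reversal layer (DANN idea): identity on the forward pass,
    multiply the incoming gradient by -lam on the backward pass, so the
    feature extractor learns to confuse the subject/domain classifier."""
    return -lam * grad

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))   # 4 features per channel (e.g. differential entropy)
w = rng.normal(size=(4, 4))
a = rng.normal(size=(4, 1))

h = graph_conv(x, w)
pooled = self_attention_pool(h, a, keep=2)
print(pooled.shape)           # prints (2, 4): two salient channels survive pooling
```

In a full model, the pooled graph features would feed both an emotion classifier and, through the gradient reversal, a domain (subject) discriminator; here only the data-flow shapes and the reversal rule are shown.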
Author_xml – sequence: 1
  givenname: Tao
  orcidid: 0000-0002-1721-561X
  surname: Xu
  fullname: Xu, Tao
  organization: Northwestern Polytechnical University, School of Software, Xi’an, People’s Republic of China
– sequence: 2
  givenname: Wang
  orcidid: 0000-0001-6795-9636
  surname: Dang
  fullname: Dang, Wang
  organization: Northwestern Polytechnical University, School of Software, Xi’an, People’s Republic of China
– sequence: 3
  givenname: Jiabao
  orcidid: 0000-0002-8889-3223
  surname: Wang
  fullname: Wang, Jiabao
  organization: Northwestern Polytechnical University, School of Software, Xi’an, People’s Republic of China
– sequence: 4
  givenname: Yun
  orcidid: 0000-0002-2306-8986
  surname: Zhou
  fullname: Zhou, Yun
  organization: Shaanxi Normal University, Faculty of Education, Xi’an, People’s Republic of China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/36548989 (View this record in MEDLINE/PubMed)
CODEN JNEOBH
ContentType Journal Article
Copyright 2023 IOP Publishing Ltd
2023 IOP Publishing Ltd.
DOI 10.1088/1741-2552/acae06
Discipline Anatomy & Physiology
EISSN 1741-2552
Genre Journal Article
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 62077036
  funderid: http://dx.doi.org/10.13039/501100001809
– fundername: The National Key Research and Development Program of China
  grantid: 2018AA000500
ISSN 1741-2560
1741-2552
IsPeerReviewed true
IsScholarly true
Issue 1
Keywords emotion recognition
subject independent
EEG
Language English
License This article is available under the terms of the IOP-Standard License.
2023 IOP Publishing Ltd.
Notes JNE-105808.R1
ORCID 0000-0002-2306-8986
0000-0002-8889-3223
0000-0002-1721-561X
0000-0001-6795-9636
PMID 36548989
PageCount 10
PublicationCentury 2000
PublicationDate 2023-02-01
PublicationPlace England
PublicationTitle Journal of neural engineering
PublicationTitleAbbrev JNE
PublicationTitleAlternate J. Neural Eng
PublicationYear 2023
Publisher IOP Publishing
StartPage 16022
SubjectTerms EEG
Electroencephalography - methods
emotion recognition
Emotions
Humans
Learning
Recognition, Psychology
subject independent
Title DAGAM: a domain adversarial graph attention model for subject-independent EEG-based emotion recognition
URI https://iopscience.iop.org/article/10.1088/1741-2552/acae06
https://www.ncbi.nlm.nih.gov/pubmed/36548989
https://www.proquest.com/docview/2758107437
Volume 20