STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
Published in | Frontiers in Human Neuroscience, Vol. 17, p. 1169949 |
---|---|
Main Authors | Li, Jingcong; Pan, Weijian; Huang, Haiyun; Pan, Jiahui; Wang, Fei |
Format | Journal Article |
Language | English |
Published | Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 13.04.2023 |
Abstract | Electroencephalogram (EEG) is a crucial and widely utilized technique in neuroscience research. In this paper, we introduce a novel graph neural network, the spatial-temporal graph attention network with a transformer encoder (STGATE), to learn graph representations of emotion EEG signals and improve emotion recognition performance. In STGATE, a transformer encoder captures time-frequency features, which are then fed into a spatial-temporal graph attention module for emotion classification. Using a dynamic adjacency matrix, STGATE adaptively learns the intrinsic connections between different EEG channels. To evaluate cross-subject emotion recognition performance, leave-one-subject-out experiments were carried out on three public emotion recognition datasets: SEED, SEED-IV, and DREAMER. STGATE achieved state-of-the-art EEG-based emotion recognition accuracies of 90.37% on SEED, 76.43% on SEED-IV, and 76.35% on DREAMER. The experiments demonstrate the effectiveness of STGATE for cross-subject EEG emotion recognition and its potential for graph-based neuroscience research. |
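The abstract describes the pipeline only at a high level: a transformer encoder over time-frequency features, followed by spatial-temporal graph attention over EEG channels with a dynamically learned adjacency matrix. The sketch below is a minimal, illustrative PyTorch reading of that description, not the authors' released code; the module names, dimensions (62 channels, 5 frequency bands, 10 time segments, 3 classes), the adjacency-as-attention-bias choice, and the pooling/classification head are all assumptions made for the example.

```python
# Illustrative sketch only: a transformer encoder over per-channel time segments,
# then graph attention across EEG channels with a learnable ("dynamic") adjacency.
# Hypothetical dimensions and module names; NOT the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphAttention(nn.Module):
    """Graph attention over EEG channels with a learnable adjacency bias."""

    def __init__(self, n_channels: int, dim: int):
        super().__init__()
        # Learnable adjacency lets the model adapt channel-to-channel connections.
        self.adj = nn.Parameter(torch.eye(n_channels) + 0.01 * torch.randn(n_channels, n_channels))
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, n_channels, dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)  # (batch, C, C)
        scores = scores + self.adj                              # adjacency acts as an attention bias
        attn = F.softmax(scores, dim=-1)
        return attn @ v                                         # aggregated channel features


class STGATESketch(nn.Module):
    """Hypothetical end-to-end model: transformer over time, graph attention over channels."""

    def __init__(self, n_channels=62, n_bands=5, n_segments=10, dim=64, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(n_bands, dim)  # per-segment band features -> embedding
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.graph_attn = DynamicGraphAttention(n_channels, dim)
        self.classifier = nn.Linear(dim, n_classes)

    def forward(self, x):  # x: (batch, n_channels, n_segments, n_bands)
        b, c, t, f = x.shape
        h = self.embed(x).reshape(b * c, t, -1)             # each channel's segments form a sequence
        h = self.temporal(h).mean(dim=1).reshape(b, c, -1)  # pool over time segments
        h = self.graph_attn(h).mean(dim=1)                  # pool over channels
        return self.classifier(h)


if __name__ == "__main__":
    # Example: a batch of 8 trials, 62 channels, 10 segments, 5 frequency-band features.
    model = STGATESketch()
    logits = model(torch.randn(8, 62, 10, 5))
    print(logits.shape)  # torch.Size([8, 3])
```

For the leave-one-subject-out protocol mentioned in the abstract, such a model would be trained on all subjects but one and tested on the held-out subject, cycling through every subject in the dataset.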
Author | Li, Jingcong; Pan, Weijian; Huang, Haiyun; Pan, Jiahui; Wang, Fei |
AuthorAffiliation | School of Software, South China Normal University, Guangzhou, China |
ContentType | Journal Article |
Copyright | Copyright © 2023 Li, Pan, Huang, Pan and Wang. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the "License"). |
DOI | 10.3389/fnhum.2023.1169949 |
Discipline | Anatomy & Physiology |
EISSN | 1662-5161 |
Genre | Journal Article |
ISSN | 1662-5161 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | deep learning; EEG-based emotion classification; graph neural network; EEG; transformer encoder |
Language | English |
License | Copyright © 2023 Li, Pan, Huang, Pan and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
Notes | Edited by: Redha Taiar, Université de Reims Champagne-Ardenne, France. Reviewed by: Mohammad Ashraful Amin, North South University, Bangladesh; Jinyi Long, Jinan University, China. This article was submitted to Brain-Computer Interfaces, a section of the journal Frontiers in Human Neuroscience. |
PMID | 37125349 |
PublicationDate | 2023-04-13 |
PublicationPlace | Lausanne, Switzerland |
PublicationTitle | Frontiers in human neuroscience |
PublicationTitleAlternate | Front Hum Neurosci |
PublicationYear | 2023 |
Publisher | Frontiers Research Foundation Frontiers Media S.A |
StartPage | 1169949 |
SubjectTerms | Accuracy Attention Brain research deep learning EEG EEG-based emotion classification Electroencephalography Emotions graph neural network Graph representations Human Neuroscience Human-computer interaction Memory Nervous system Neural networks Neurosciences Physiology transformer encoder |
Title | STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition |
URI | https://www.ncbi.nlm.nih.gov/pubmed/37125349 https://www.proquest.com/docview/2799916291 https://www.proquest.com/docview/2808215255 https://pubmed.ncbi.nlm.nih.gov/PMC10133470 https://doaj.org/article/cd61e1bf9c714b528a30aadca1a5058d |
Volume | 17 |