Adaptive Spatial–Temporal Aware Graph Learning for EEG-Based Emotion Recognition
Published in | Cyborg and Bionic Systems Vol. 5; p. 0088 |
---|---|
Main Authors | Weishan Ye, Jiyuan Wang, Lin Chen, Lifei Dai, Zhe Sun, Zhen Liang |
Format | Journal Article |
Language | English |
Published | United States: American Association for the Advancement of Science (AAAS), 01.01.2024 |
Subjects | Cybernetics; Q300-390 |
Online Access | Get full text |
ISSN | 2097-1087, 2692-7632 |
DOI | 10.34133/cbsystems.0088 |
Abstract | An intelligent emotion recognition system based on electroencephalography (EEG) signals shows considerable potential in various domains such as healthcare, entertainment, and education, thanks to its portability, high temporal resolution, and real-time capabilities. However, the existing research in this field faces limitations stemming from the nonstationary nature and individual variability of EEG signals. In this study, we present a novel EEG emotion recognition model, named GraphEmotionNet, designed to enhance the accuracy of EEG-based emotion recognition through the incorporation of a spatiotemporal attention mechanism and transfer learning. The proposed GraphEmotionNet model can effectively learn the intrinsic connections between EEG channels and construct an adaptive graph. This graph’s adaptive nature is crucial in optimizing spatial–temporal graph convolutions, which in turn enhances spatial–temporal feature characterization and contributes to the process of emotion classification. Moreover, an integration of domain adaptation aligns the extracted features across different domains, further alleviating the impact of individual EEG variability. We evaluate the model performance on two benchmark databases, employing two types of cross-validation protocols: within-subject cross-validation and cross-subject cross-validation. The experimental results affirm the model’s efficacy in extracting EEG features linked to emotional semantics and demonstrate its promising performance in emotion recognition. |
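The abstract describes the core technical idea: an adjacency matrix over EEG channels is learned adaptively and used to drive spatial–temporal graph convolutions before emotion classification. Below is a minimal sketch of that mechanism, assuming PyTorch and illustrative channel counts, feature sizes, and layer choices; it is not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): a learnable adjacency matrix is
# trained jointly with a spatial graph convolution followed by a temporal
# convolution, roughly matching the adaptive spatial-temporal graph idea
# described in the abstract. All sizes below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveSTGraphConv(nn.Module):
    def __init__(self, num_channels=62, in_feats=5, hidden=32, kernel_t=3):
        super().__init__()
        # Learnable adjacency logits; a row-wise softmax keeps each row a valid
        # weighting over the other EEG channels (one way to make the graph "adaptive").
        self.adj_logits = nn.Parameter(torch.randn(num_channels, num_channels))
        self.spatial_proj = nn.Linear(in_feats, hidden)
        # Temporal convolution over the time axis of the segment.
        self.temporal_conv = nn.Conv1d(hidden, hidden, kernel_t, padding=kernel_t // 2)

    def forward(self, x):
        # x: (batch, time, channels, features), e.g. band-wise features per channel
        adj = F.softmax(self.adj_logits, dim=-1)            # (C, C)
        h = self.spatial_proj(x)                            # (B, T, C, H)
        h = torch.einsum("cd,btdh->btch", adj, h)           # graph message passing
        b, t, c, hid = h.shape
        h = h.permute(0, 2, 3, 1).reshape(b * c, hid, t)
        h = F.relu(self.temporal_conv(h))                   # temporal filtering
        return h.reshape(b, c, hid, t).permute(0, 3, 1, 2)  # (B, T, C, H)


if __name__ == "__main__":
    model = AdaptiveSTGraphConv()
    out = model(torch.randn(8, 10, 62, 5))  # 8 segments, 10 steps, 62 channels
    print(out.shape)                        # torch.Size([8, 10, 62, 32])
```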
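The abstract also mentions domain adaptation to align features across subjects and a cross-subject cross-validation protocol. The sketch below assumes a gradient-reversal style of domain-adversarial alignment and a leave-one-subject-out split; both are common choices used here only to illustrate the setup, not the paper's exact recipe.

```python
# Hypothetical sketch (assumptions, not the paper's exact method): a
# gradient-reversal layer of the kind used in domain-adversarial training to
# encourage subject-invariant EEG features, plus a leave-one-subject-out split
# for the cross-subject cross-validation protocol mentioned in the abstract.
import torch
from torch.autograd import Function


class GradReverse(Function):
    """Identity in the forward pass; flips (and scales) gradients in backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def leave_one_subject_out(subject_ids):
    """Yield (train_subjects, test_subject) pairs for cross-subject CV."""
    for test_subject in subject_ids:
        yield [s for s in subject_ids if s != test_subject], test_subject


if __name__ == "__main__":
    feats = torch.randn(4, 32, requires_grad=True)
    # Features fed to a domain classifier would pass through the reversal:
    reversed_feats = GradReverse.apply(feats, 1.0)
    reversed_feats.sum().backward()
    print(feats.grad[0, :3])  # gradients are negated relative to a plain sum

    for train, test in leave_one_subject_out(list(range(1, 16))):  # e.g. 15 subjects
        pass  # train on `train`, adapt to and evaluate on held-out subject `test`
```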
Author | Weishan Ye; Jiyuan Wang; Lifei Dai; Lin Chen; Zhen Liang; Zhe Sun |
AuthorAffiliation | 1 School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen, China
2 Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen, China
3 School of Clinical Medicine, Harbin Medical University, Harbin, China
4 Faculty of Health Data Science and Faculty of Medicine, Juntendo University, Tokyo, Japan
5 International Health Science Innovation Center, Medical School, Shenzhen University, Shenzhen, China |
Author_xml | – sequence 1: Weishan Ye, School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen, China; Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen, China
– sequence 2: Jiyuan Wang, School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen, China; Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen, China
– sequence 3: Lin Chen, School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen, China; Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen, China
– sequence 4: Lifei Dai, School of Clinical Medicine, Harbin Medical University, Harbin, China
– sequence 5: Zhe Sun, Faculty of Health Data Science and Faculty of Medicine, Juntendo University, Tokyo, Japan
– sequence 6: Zhen Liang (ORCID 0000-0002-1749-2975), School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen, China; Guangdong Provincial Key Laboratory of Biomedical Measurements and Ultrasound Imaging, Shenzhen, China; International Health Science Innovation Center, Medical School, Shenzhen University, Shenzhen, China |
BackLink | https://cir.nii.ac.jp/crid/1871147691541208448 (view record in CiNii)
https://www.ncbi.nlm.nih.gov/pubmed/40391296 (view this record in MEDLINE/PubMed) |
ContentType | Journal Article |
Copyright | Copyright © 2024 Weishan Ye et al. |
DOI | 10.34133/cbsystems.0088 |
DatabaseName | CiNii Complete CrossRef PubMed MEDLINE - Academic PubMed Central (Full Participant titles) DOAJ Directory of Open Access Journals |
DatabaseTitle | CrossRef PubMed MEDLINE - Academic |
Discipline | Sciences (General) |
EISSN | 2692-7632 |
ExternalDocumentID | oai_doaj_org_article_fe462fe5cdb84dce99919c943804921e PMC12087903 40391296 10_34133_cbsystems_0088 |
Genre | Journal Article |
GrantInformation_xml | – grantid: 62276169, 62071310, and 82272114
– grantid: 2022SHIBS0003
– grantid: KCXFZ20201221173613036 |
ISSN | 2692-7632 2097-1087 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | Copyright © 2024 Weishan Ye et al. Exclusive licensee Beijing Institute of Technology Press. No claim to original U.S. Government Works. Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0). |
ORCID | 0000-0002-1749-2975 |
OpenAccessLink | https://doaj.org/article/fe462fe5cdb84dce99919c943804921e |
PMID | 40391296 |
PublicationCentury | 2000 |
PublicationDate | 2024-01-01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationTitle | Cyborg and Bionic Systems |
PublicationTitleAlternate | Cyborg Bionic Syst |
PublicationYear | 2024 |
Publisher | American Association for the Advancement of Science (AAAS) |
StartPage | 0088 |
SubjectTerms | Cybernetics Q300-390 |
Title | Adaptive Spatial–Temporal Aware Graph Learning for EEG-Based Emotion Recognition |
URI | https://cir.nii.ac.jp/crid/1871147691541208448 https://www.ncbi.nlm.nih.gov/pubmed/40391296 https://www.proquest.com/docview/3205820776 https://pubmed.ncbi.nlm.nih.gov/PMC12087903 https://doaj.org/article/fe462fe5cdb84dce99919c943804921e |
Volume | 5 |