Cross-Subject Emotion Recognition Brain–Computer Interface Based on fNIRS and DBJNet
Published in | Cyborg and bionic systems, Vol. 4, p. 0045 |
---|---|
Main Authors | Si, Xiaopeng; He, Huang; Yu, Jiayue; Ming, Dong |
Format | Journal Article |
Language | English |
Published | United States: American Association for the Advancement of Science (AAAS), 2023 |
Online Access | Get full text |
ISSN | 2692-7632; 2097-1087 |
DOI | 10.34133/cbsystems.0045 |
Abstract | Functional near-infrared spectroscopy (fNIRS) is a noninvasive brain imaging technique that has gradually been applied in emotion recognition research due to its high spatial resolution, real-time capability, and convenience. However, current fNIRS-based emotion recognition research is largely limited to within-subject settings, and related work on cross-subject emotion recognition is lacking. Therefore, in this paper, we designed an emotion-evoking experiment with videos as stimuli and constructed an fNIRS emotion recognition database. On this basis, deep learning technology was introduced for the first time, and a dual-branch joint network (DBJNet) was constructed, giving the model the ability to generalize to new participants. The decoding performance of the proposed model shows that fNIRS can effectively distinguish positive versus neutral versus negative emotions (74.8% accuracy, 72.9% F1 score), and its performance on the 2-category tasks of distinguishing positive versus neutral (89.5% accuracy, 88.3% F1 score) and negative versus neutral (91.7% accuracy, 91.1% F1 score) demonstrates that fNIRS has a powerful ability to decode emotions. Furthermore, an ablation study of the model structure shows that jointly using the convolutional neural network branch and the statistical branch achieves the highest decoding performance. This work is expected to facilitate the development of fNIRS affective brain–computer interfaces. |
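The record contains no code, but the abstract sketches the DBJNet idea: one branch learns temporal features from the fNIRS signal with convolutions while a second branch computes hand-crafted statistics, and the two are joined before classification. The following is a minimal NumPy illustration of that dual-branch pattern, not the authors' DBJNet; all shapes, feature choices, and names (e.g. `statistical_features`, the 8 random "filters") are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_branch(x, kernels):
    # x: (channels, time); kernels: (n_kernels, kernel_len)
    # Temporal convolution + ReLU + global average pooling per kernel.
    feats = []
    for k in kernels:
        out = np.array([np.convolve(ch, k, mode="valid") for ch in x])
        feats.append(np.maximum(out, 0).mean())
    return np.array(feats)

def statistical_features(x):
    # Hand-crafted per-trial statistics: mean, std, and linear slope per channel.
    t = np.arange(x.shape[1])
    slope = np.array([np.polyfit(t, ch, 1)[0] for ch in x])
    return np.concatenate([x.mean(axis=1), x.std(axis=1), slope])

def dual_branch_forward(x, kernels, W, b):
    # Join the two branches by concatenation, then a linear softmax classifier
    # over the 3 emotion classes (positive / neutral / negative).
    z = np.concatenate([conv_branch(x, kernels), statistical_features(x)])
    logits = W @ z + b
    e = np.exp(logits - logits.max())
    return e / e.sum()

channels, T, n_classes = 4, 50, 3
x = rng.standard_normal((channels, T))      # one toy fNIRS trial
kernels = rng.standard_normal((8, 5))       # 8 stand-in temporal filters
feat_dim = 8 + 3 * channels                 # conv features + 3 stats per channel
W = rng.standard_normal((n_classes, feat_dim)) * 0.1
b = np.zeros(n_classes)
p = dual_branch_forward(x, kernels, W, b)   # class probabilities, sums to 1
```

In the paper's ablation study it is precisely this joint use of the learned-convolution branch and the statistical branch that yields the best decoding performance, which the concatenation step above mimics in miniature.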
Author | Si, Xiaopeng; He, Huang; Yu, Jiayue; Ming, Dong |
AuthorAffiliation | 1 Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People’s Republic of China; 2 Tianjin Key Laboratory of Brain Science and Neural Engineering, Tianjin University, Tianjin 300072, People’s Republic of China; 3 Tianjin International Engineering Institute, Tianjin University, Tianjin 300072, People’s Republic of China |
ContentType | Journal Article |
Copyright | Copyright © 2023 Xiaopeng Si et al. |
DOI | 10.34133/cbsystems.0045 |
DatabaseName | CrossRef PubMed MEDLINE - Academic PubMed Central (Full Participant titles) DOAJ Directory of Open Access Journals |
DatabaseTitle | CrossRef PubMed MEDLINE - Academic |
Discipline | Sciences (General) |
EISSN | 2692-7632 |
ExternalDocumentID | oai_doaj_org_article_a3abfe2b7cdc4962ad701a55f7f3886a PMC10374245 37519929 10_34133_cbsystems_0045 |
Genre | Journal Article |
ISSN | 2692-7632 2097-1087 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | Copyright © 2023 Xiaopeng Si et al. Exclusive licensee Beijing Institute of Technology Press. No claim to original U.S. Government Works. Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0). |
Notes | These authors contributed equally to this work. |
ORCID | 0000-0002-8956-7577 |
OpenAccessLink | https://doaj.org/article/a3abfe2b7cdc4962ad701a55f7f3886a |
PMID | 37519929 |
PQID | 2844088354 |
PQPubID | 23479 |
PublicationDate | 2023 |
PublicationPlace | United States |
PublicationTitle | Cyborg and bionic systems |
PublicationTitleAlternate | Cyborg Bionic Syst |
PublicationYear | 2023 |
Publisher | American Association for the Advancement of Science (AAAS) |
SourceID | doaj pubmedcentral proquest pubmed crossref |
SourceType | Open Website Open Access Repository Aggregation Database Index Database Enrichment Source |
StartPage | 0045 |
Title | Cross-Subject Emotion Recognition Brain–Computer Interface Based on fNIRS and DBJNet |
URI | https://www.ncbi.nlm.nih.gov/pubmed/37519929 https://www.proquest.com/docview/2844088354 https://pubmed.ncbi.nlm.nih.gov/PMC10374245 https://doaj.org/article/a3abfe2b7cdc4962ad701a55f7f3886a |
Volume | 4 |
hasFullText | 1 |
inHoldings | 1 |
linkProvider | Directory of Open Access Journals |