E2ENNet: An end-to-end neural network for emotional brain-computer interface

Objective: Emotional brain-computer interfaces can recognize or regulate human emotions for workload detection and the auxiliary diagnosis of mental illness. However, existing EEG emotion recognition is carried out step by step, with separate feature engineering and classification stages, resulting in high engineering c...

Bibliographic Details
Published in Frontiers in computational neuroscience Vol. 16; p. 942979
Main Authors Han, Zhichao, Chang, Hongli, Zhou, Xiaoyan, Wang, Jihao, Wang, Lili, Shao, Yongbin
Format Journal Article
Language English
Published Lausanne Frontiers Research Foundation 12.08.2022
Frontiers Media S.A
Subjects
Online Access Get full text

Abstract Objective: Emotional brain-computer interfaces can recognize or regulate human emotions for workload detection and the auxiliary diagnosis of mental illness. However, existing EEG emotion recognition is carried out step by step, with separate feature engineering and classification stages, resulting in high engineering complexity and limiting practical applications of traditional EEG emotion recognition. We propose an end-to-end neural network, E2ENNet. Methods: Baseline removal and sliding-window slicing were used to preprocess the raw EEG signal; convolution blocks extracted features, an LSTM network captured the correlations among features, and a softmax function classified emotions. Results: Extensive experiments under a subject-dependent protocol were conducted to evaluate the proposed E2ENNet, which achieves state-of-the-art accuracy on three public datasets: 96.28% in the 2-category experiment on the DEAP dataset, 98.1% in the 2-category experiment on the DREAMER dataset, and 41.73% in the 7-category experiment on the MPED dataset. Conclusion: The experimental results show that E2ENNet can directly extract more discriminative features from raw EEG signals. Significance: This study provides a methodology for implementing a plug-and-play emotional brain-computer interface system.
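The abstract's preprocessing step (baseline removal followed by sliding-window slicing of the raw EEG signal) can be sketched as follows. This is an illustrative sketch only: the function names, window parameters, and per-channel mean subtraction are assumptions, not details taken from the paper.

```python
# Hedged sketch of E2ENNet-style preprocessing: subtract each channel's
# baseline mean, then cut the signal into overlapping windows.
# window_len / step values are illustrative, not from the paper.

def remove_baseline(trial, baseline):
    """Subtract each channel's mean baseline value from that channel's signal."""
    cleaned = []
    for ch_sig, ch_base in zip(trial, baseline):
        mean_b = sum(ch_base) / len(ch_base)
        cleaned.append([x - mean_b for x in ch_sig])
    return cleaned

def sliding_windows(trial, window_len, step):
    """Slice a multi-channel signal into overlapping fixed-length windows."""
    n_samples = len(trial[0])
    return [
        [ch[start:start + window_len] for ch in trial]
        for start in range(0, n_samples - window_len + 1, step)
    ]

# Toy example: 2 channels, 10 samples each, 3-sample baseline segment.
trial = [[float(i) for i in range(10)], [float(2 * i) for i in range(10)]]
baseline = [[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]]
clean = remove_baseline(trial, baseline)
windows = sliding_windows(clean, window_len=4, step=2)
print(len(windows))  # 4 windows, starting at samples 0, 2, 4, 6
```

Each resulting window (channels x window_len) would then be fed to the convolution blocks, with the LSTM modeling correlations across the extracted features before the softmax classifier.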
Author Shao, Yongbin
Han, Zhichao
Chang, Hongli
Wang, Lili
Zhou, Xiaoyan
Wang, Jihao
AuthorAffiliation 2 The Key Laboratory of Child Development and Learning Science of Ministry of Education, Southeast University, Nanjing, China
1 School of Electronic and Information Engineering, Nanjing University of Information Science and Technology, Nanjing, China
AuthorAffiliation_xml – name: 2 The Key Laboratory of Child Development and Learning Science of Ministry of Education, Southeast University, Nanjing, China
– name: 1 School of Electronic and Information Engineering, Nanjing University of Information Science and Technology, Nanjing, China
Author_xml – sequence: 1
  givenname: Zhichao
  surname: Han
  fullname: Han, Zhichao
– sequence: 2
  givenname: Hongli
  surname: Chang
  fullname: Chang, Hongli
– sequence: 3
  givenname: Xiaoyan
  surname: Zhou
  fullname: Zhou, Xiaoyan
– sequence: 4
  givenname: Jihao
  surname: Wang
  fullname: Wang, Jihao
– sequence: 5
  givenname: Lili
  surname: Wang
  fullname: Wang, Lili
– sequence: 6
  givenname: Yongbin
  surname: Shao
  fullname: Shao, Yongbin
CitedBy_id crossref_primary_10_1016_j_dsp_2023_104278
crossref_primary_10_3390_s24154837
crossref_primary_10_3389_fnins_2023_1183132
crossref_primary_10_3389_fnins_2023_1213099
crossref_primary_10_3390_brainsci13070977
crossref_primary_10_3390_s23020915
Cites_doi 10.1109/TAFFC.2020.3025777
10.1109/TCDS.2020.2999337
10.1007/s10726-008-9153-7
10.1109/ICRA.2016.7487478
10.1109/TNSRE.2019.2914916
10.1080/2326263X.2014.912881
10.1016/j.neuroimage.2005.11.027
10.1016/j.tics.2012.09.005
10.1109/TNSRE.2019.2956488
10.1109/TNNLS.2021.3118468
10.1088/1742-6596/1966/1/012043
10.1109/JBHI.2017.2688239
10.1109/NER.2013.6695876
10.1016/j.neunet.2014.06.012
10.1109/ICICS.2018.00071
10.14569/IJACSA.2017.081046
10.1109/TAFFC.2018.2885474
10.1016/j.asoc.2020.106954
10.1007/978-3-030-36808-1_75
10.1109/34.954607
10.1109/ACCESS.2019.2949707
10.1109/T-AFFC.2011.15
10.1162/neco.1997.9.8.1735
10.15171/icnj.2017.01
10.3778/j.issn.1002-8331.2010-0263
10.48550/arXiv.1505.07818
10.1016/j.tics.2010.11.004
10.1088/1741-2552/aace8c
10.1109/BIBM.2018.8621147
10.1109/TAFFC.2017.2714671
10.3969/j.issn.0255-8297.2021.03.001
10.1109/ACCESS.2019.2891579
10.1109/79.911197
10.1016/j.neucom.2017.08.039
10.1109/TCYB.2017.2788081
ContentType Journal Article
Copyright 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Copyright © 2022 Han, Chang, Zhou, Wang, Wang and Shao.
Copyright_xml – notice: 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
– notice: Copyright © 2022 Han, Chang, Zhou, Wang, Wang and Shao.
DBID AAYXX
CITATION
3V.
7XB
88I
8FE
8FH
8FK
ABUWG
AFKRA
AZQEC
BBNVY
BENPR
BHPHI
CCPQU
DWQXO
GNUQQ
HCIFZ
LK8
M2P
M7P
PHGZM
PHGZT
PIMPY
PKEHL
PQEST
PQGLB
PQQKQ
PQUKI
PRINS
Q9U
7X8
5PM
DOA
DOI 10.3389/fncom.2022.942979
DatabaseName CrossRef
ProQuest Central (Corporate)
ProQuest Central (purchase pre-March 2016)
Science Database (Alumni Edition)
ProQuest SciTech Collection
ProQuest Natural Science Collection
ProQuest Central (Alumni) (purchase pre-March 2016)
ProQuest Central (Alumni)
ProQuest Central UK/Ireland
ProQuest Central Essentials
Biological Science Collection
ProQuest Central
Natural Science Collection
ProQuest One
ProQuest Central Korea
ProQuest Central Student
SciTech Premium Collection
ProQuest Biological Science Collection
Science Database
Biological Science Database
ProQuest Central Premium
ProQuest One Academic (New)
ProQuest Publicly Available Content Database
ProQuest One Academic Middle East (New)
ProQuest One Academic Eastern Edition (DO NOT USE)
ProQuest One Applied & Life Sciences
ProQuest One Academic
ProQuest One Academic UKI Edition
ProQuest Central China
ProQuest Central Basic
MEDLINE - Academic
PubMed Central (Full Participant titles)
DOAJ Directory of Open Access Journals
DatabaseTitle CrossRef
Publicly Available Content Database
ProQuest Central Student
ProQuest One Academic Middle East (New)
ProQuest Central Essentials
ProQuest Central (Alumni Edition)
SciTech Premium Collection
ProQuest One Community College
ProQuest Natural Science Collection
ProQuest Central China
ProQuest Central
ProQuest One Applied & Life Sciences
Natural Science Collection
ProQuest Central Korea
Biological Science Collection
ProQuest Central (New)
ProQuest Science Journals (Alumni Edition)
ProQuest Biological Science Collection
ProQuest Central Basic
ProQuest Science Journals
ProQuest One Academic Eastern Edition
Biological Science Database
ProQuest SciTech Collection
ProQuest One Academic UKI Edition
ProQuest One Academic
ProQuest One Academic (New)
ProQuest Central (Alumni)
MEDLINE - Academic
DatabaseTitleList
Publicly Available Content Database
MEDLINE - Academic
Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: BENPR
  name: ProQuest Central
  url: https://www.proquest.com/central
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Anatomy & Physiology
EISSN 1662-5188
ExternalDocumentID oai_doaj_org_article_742c1557af0143ae9d917d231c065a2e
PMC9413837
10_3389_fncom_2022_942979
GroupedDBID ---
29H
2WC
53G
5GY
5VS
8FE
8FH
9T4
AAFWJ
AAYXX
ABUWG
ACGFO
ACGFS
ACXDI
ADBBV
ADMLS
ADRAZ
AEGXH
AENEX
AFKRA
AFPKN
AIAGR
ALMA_UNASSIGNED_HOLDINGS
AOIJS
ARCSS
AZQEC
BAWUL
BBNVY
BCNDV
BENPR
BHPHI
BPHCQ
CITATION
CS3
DIK
E3Z
F5P
GROUPED_DOAJ
GX1
HCIFZ
HYE
KQ8
LK8
M2P
M48
M7P
M~E
O5R
O5S
OK1
OVT
P2P
PGMZT
PIMPY
PQQKQ
PROAC
RNS
RPM
TR2
3V.
7XB
88I
8FK
CCPQU
DWQXO
GNUQQ
PHGZM
PHGZT
PKEHL
PQEST
PQGLB
PQUKI
PRINS
Q9U
7X8
5PM
ID FETCH-LOGICAL-c470t-433ab2e5e345c00b85dc8bc047ae5dd24d31a18988ccd406d1bca2d58130fded3
IEDL.DBID DOA
ISSN 1662-5188
IngestDate Wed Aug 27 01:24:52 EDT 2025
Thu Aug 21 18:45:36 EDT 2025
Fri Jul 11 11:16:58 EDT 2025
Mon Jun 30 09:58:08 EDT 2025
Tue Jul 01 02:18:17 EDT 2025
Thu Apr 24 23:01:08 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c470t-433ab2e5e345c00b85dc8bc047ae5dd24d31a18988ccd406d1bca2d58130fded3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
Edited by: Petia D. Koprinkova-Hristova, Institute of Information and Communication Technologies (BAS), Bulgaria
These authors have contributed equally to this work
Reviewed by: Pallavi Pandey, KR Mangalam University, India; Jing Jin, East China University of Science and Technology, China
OpenAccessLink https://doaj.org/article/742c1557af0143ae9d917d231c065a2e
PMID 36034935
PQID 2701312811
PQPubID 4424409
ParticipantIDs doaj_primary_oai_doaj_org_article_742c1557af0143ae9d917d231c065a2e
pubmedcentral_primary_oai_pubmedcentral_nih_gov_9413837
proquest_miscellaneous_2707874167
proquest_journals_2701312811
crossref_citationtrail_10_3389_fncom_2022_942979
crossref_primary_10_3389_fncom_2022_942979
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2022-08-12
PublicationDateYYYYMMDD 2022-08-12
PublicationDate_xml – month: 08
  year: 2022
  text: 2022-08-12
  day: 12
PublicationDecade 2020
PublicationPlace Lausanne
PublicationPlace_xml – name: Lausanne
PublicationTitle Frontiers in computational neuroscience
PublicationYear 2022
Publisher Frontiers Research Foundation
Frontiers Media S.A
Publisher_xml – name: Frontiers Research Foundation
– name: Frontiers Media S.A
References B44
Jia (B14)
Li (B22) 2018
Duan (B7) 2013
Jin (B16) 2020; 28
Lawhern (B19) 2016; 15
Ganin (B9) 2016; 17
Lindquist (B24) 2012; 16
Liu (B26) 2018; 275
Li (B23) 2021; 12
Martinovski (B29) 2009; 18
Zhang (B43) 2019; 49
Wang (B39) 2016
Stamos (B34) 2017; 22
Alhagry (B2) 2017; 8
Hao (B10) 2021; 39
Song (B31) 2019; 7
Li (B20) 2019; 7
Jain (B13) 2016
Jia (B15)
Jin (B17) 2021
B12
Soroush (B33) 2017; 4
B36
Li (B21) 2020; 13
Chen (B4) 2021; 58
Lotfi (B27) 2014; 59
Yu (B42) 2019; 27
Picard (B30) 2001; 23
Cowie (B6) 2002; 18
Waldron (B38) 1994; 17
Ma (B28) 2021; 1966
Christian (B5) 2014; 1
Britton (B3) 2006; 31
Liu (B25) 2019
Tao (B37) 2020; 99
Yang (B40) 2018
Etkin (B8) 2011; 15
Yin (B41) 2021; 100
Song (B32) 2018; 11
Hochreiter (B11) 1997; 9
Koelstra (B18) 2012; 3
Sulthan (B35) 2018; 16
Alarcao (B1) 2019; 10
References_xml – ident: B12
– volume: 99
  start-page: 3025777
  year: 2020
  ident: B37
  article-title: Eeg-based emotion recognition via channel-wise attention and self attention
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/TAFFC.2020.3025777
– volume: 13
  start-page: 354
  year: 2020
  ident: B21
  article-title: A novel bi-hemispheric discrepancy model for eeg emotion recognition
  publication-title: IEEE Trans. Cogn. Dev. Syst
  doi: 10.1109/TCDS.2020.2999337
– volume: 18
  start-page: 235
  year: 2009
  ident: B29
  article-title: Emotion as an argumentation engine: modeling the role of emotion in negotiation
  publication-title: Group Decis. Negotiat
  doi: 10.1007/s10726-008-9153-7
– start-page: 532
  volume-title: 2016 IEEE International Conference on Robotics and Automation (ICRA)
  year: 2016
  ident: B13
  article-title: Recurrent neural networks for driver activity anticipation via sensory-fusion architecture
  doi: 10.1109/ICRA.2016.7487478
– volume: 17
  start-page: 236
  year: 1994
  ident: B38
  article-title: Once more, with feeling: Reconsidering the role of emotion in work
  publication-title: Ann. Int. Commun. Assoc
– volume: 27
  start-page: 1292
  year: 2019
  ident: B42
  article-title: An asynchronous hybrid spelling approach based on eeg-eog signals for chinese character input
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng
  doi: 10.1109/TNSRE.2019.2914916
– volume: 1
  start-page: 66
  year: 2014
  ident: B5
  article-title: A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges
  publication-title: Brain Comput. Interfaces
  doi: 10.1080/2326263X.2014.912881
– volume: 31
  start-page: 397
  year: 2006
  ident: B3
  article-title: Neural correlates of social and nonsocial emotions: an fmri study
  publication-title: Neuroimage
  doi: 10.1016/j.neuroimage.2005.11.027
– volume: 16
  start-page: 33
  year: 2012
  ident: B24
  article-title: A functional architecture of the human brain: emerging insights from the science of emotion
  publication-title: Trends Cogn. Sci
  doi: 10.1016/j.tics.2012.09.005
– volume: 28
  start-page: 3
  year: 2020
  ident: B16
  article-title: The study of generic model set for reducing calibration time in p300-based brain-computer interface
  publication-title: IEEE Trans. Neural Syst. Rehabil. Eng
  doi: 10.1109/TNSRE.2019.2956488
– volume-title: IEEE Trans. Neural Netw. Learn. Syst
  year: 2021
  ident: B17
  article-title: Robust similarity measurement based on a novel time filter for SSVEPs detection
  doi: 10.1109/TNNLS.2021.3118468
– volume: 1966
  start-page: 012043
  year: 2021
  ident: B28
  article-title: Eeg emotion recognition based on optimal feature selection
  publication-title: J. Phys
  doi: 10.1088/1742-6596/1966/1/012043
– volume: 22
  start-page: 98
  year: 2017
  ident: B34
  article-title: Dreamer: a database for emotion recognition through eeg and ecg signals from wireless low-cost off-the-shelf devices
  publication-title: IEEE J. Biomed. Health Inform
  doi: 10.1109/JBHI.2017.2688239
– start-page: 81
  volume-title: 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER)
  year: 2013
  ident: B7
  article-title: Differential entropy feature for eeg-based emotion classification
  doi: 10.1109/NER.2013.6695876
– volume: 59
  start-page: 61
  year: 2014
  ident: B27
  article-title: Practical emotional neural networks
  publication-title: Neural Netw
  doi: 10.1016/j.neunet.2014.06.012
– volume: 16
  start-page: 315
  year: 2018
  ident: B35
  article-title: Emotion recognition using brain signals
  publication-title: Int. Conf. Intell. Circ. Syst
  doi: 10.1109/ICICS.2018.00071
– volume: 8
  start-page: 345
  year: 2017
  ident: B2
  article-title: Emotion recognition based on eeg using lstm recurrent neural network
  publication-title: Int. J. Adv. Comput. Sci. Appl
  doi: 10.14569/IJACSA.2017.081046
– volume: 12
  start-page: 494
  year: 2021
  ident: B23
  article-title: A bi-hemisphere domain adversarial neural network model for eeg emotion recognition
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/TAFFC.2018.2885474
– volume: 100
  start-page: 106954
  year: 2021
  ident: B41
  article-title: Eeg emotion recognition using fusion model of graph convolutional neural networks and lstm
  publication-title: Appl. Soft. Comput
  doi: 10.1016/j.asoc.2020.106954
– start-page: 1324
  volume-title: Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence IJCAI-PRICAI-20
  ident: B15
  article-title: Graphsleepnet: adaptive spatial-temporal graph convolutional networks for sleep stage classification
– volume-title: Neural Information Processing
  year: 2019
  ident: B25
  article-title: Sparse graphic attention LSTM for EEG emotion recognition
  doi: 10.1007/978-3-030-36808-1_75
– volume: 23
  start-page: 1175
  year: 2001
  ident: B30
  article-title: Toward machine emotional intelligence: analysis of affective physiological state
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell
  doi: 10.1109/34.954607
– ident: B36
– start-page: 2909
  volume-title: MM '20: The 28th ACM International Conference on Multimedia
  ident: B14
  article-title: Sst-emotionnet: spatial-spectral-temporal based attention 3d dense network for EEG emotion recognition
– volume: 7
  start-page: 155724
  year: 2019
  ident: B20
  article-title: The fusion of electroencephalography and facial expression for continuous emotion recognition
  publication-title: IEEE Access
  doi: 10.1109/ACCESS.2019.2949707
– volume: 3
  start-page: 18
  year: 2012
  ident: B18
  article-title: Deap: a database for emotion analysis using physiological signals
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/T-AFFC.2011.15
– volume: 9
  start-page: 1735
  year: 1997
  ident: B11
  article-title: Long short-term memory
  publication-title: Neural Comput
  doi: 10.1162/neco.1997.9.8.1735
– start-page: 2378
  volume-title: CVPR 2016 IEEE Conference on Computer Vision and Pattern Recognition 2016
  year: 2016
  ident: B39
  article-title: Recurrent face aging
– volume: 4
  start-page: 118
  year: 2017
  ident: B33
  article-title: A review on eeg signals based emotion recognition
  publication-title: Int. Clin. Neurosci. J
  doi: 10.15171/icnj.2017.01
– ident: B44
– volume: 58
  start-page: 175
  year: 2021
  ident: B4
  article-title: Emotion recognition of EEG based on ensemble capsnet
  publication-title: Comput. Eng. Appl
  doi: 10.3778/j.issn.1002-8331.2010-0263
– volume: 17
  start-page: 2096
  year: 2016
  ident: B9
  article-title: Domain-adversarial training of neural networks
  publication-title: J. Mach. Learn. Res
  doi: 10.48550/arXiv.1505.07818
– volume: 15
  start-page: 85
  year: 2011
  ident: B8
  article-title: Emotional processing in anterior cingulate and medial prefrontal cortex
  publication-title: Trends Cognit
  doi: 10.1016/j.tics.2010.11.004
– volume: 15
  start-page: 056013.1
  year: 2016
  ident: B19
  article-title: EEGNet: a compact convolutional network for eeg-based brain-computer interfaces
  publication-title: J. Neural Eng
  doi: 10.1088/1741-2552/aace8c
– volume: 11
  start-page: 532
  year: 2018
  ident: B32
  article-title: Eeg emotion recognition using dynamical graph convolutional neural networks
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/BIBM.2018.8621147
– volume: 10
  start-page: 374
  year: 2019
  ident: B1
  article-title: Emotions recognition using EEG signals: a survey
  publication-title: IEEE Trans. Affect. Comput
  doi: 10.1109/TAFFC.2017.2714671
– start-page: 1
  volume-title: 2018 International Joint Conference on Neural Networks (IJCNN)
  year: 2018
  ident: B40
  article-title: Emotion recognition from multi-channel eeg through parallel convolutional recurrent neural network
– volume: 39
  start-page: 347
  year: 2021
  ident: B10
  article-title: Emotion classification based on eeg deep learning
  publication-title: J. Appl. Sci
  doi: 10.3969/j.issn.0255-8297.2021.03.001
– volume: 7
  start-page: 12177
  year: 2019
  ident: B31
  article-title: Mped: A multi-modal physiological emotion database for discrete emotion recognition
  publication-title: IEEE Access
  doi: 10.1109/ACCESS.2019.2891579
– volume: 18
  start-page: 32
  year: 2002
  ident: B6
  article-title: Emotion recognition in human-computer interaction
  publication-title: IEEE Signal Process. Mag
  doi: 10.1109/79.911197
– start-page: 1561
  volume-title: Twenty-Seventh International Joint Conference on Artificial Intelligence IJCAI-18
  year: 2018
  ident: B22
  article-title: A novel neural network model based on cerebral hemispheric asymmetry for EEG emotion recognition
– volume: 275
  start-page: 288
  year: 2018
  ident: B26
  article-title: Deep learning based on batch normalization for p300 signal detection
  publication-title: Neurocomputing
  doi: 10.1016/j.neucom.2017.08.039
– volume: 49
  start-page: 839
  year: 2019
  ident: B43
  article-title: Spatial-temporal recurrent neural network for emotion recognition
  publication-title: IEEE Trans. Cybern
  doi: 10.1109/TCYB.2017.2788081
SSID ssj0062650
Score 2.3501103
Snippet Objective: Emotional brain-computer interface can recognize or regulate human emotions for workload detection and auxiliary diagnosis of mental illness....
Emotional brain-computer interface can recognize or regulate human emotions for workload detection and auxiliary diagnosis of mental illness. However, the...
Objective: Emotional brain-computer interface can recognize or regulate human emotions for workload detection and auxiliary diagnosis of mental illness. However,...
SourceID doaj
pubmedcentral
proquest
crossref
SourceType Open Website
Open Access Repository
Aggregation Database
Enrichment Source
Index Database
StartPage 942979
SubjectTerms Accuracy
Brain
Classification
Computer applications
Datasets
Deep learning
depthwise separable convolution
EEG
Electrodes
electroencephalogram (EEG)
Electroencephalography
emotional brain-computer interface
Emotions
Experiments
Implants
long short-term memory
Mental disorders
Neural networks
neurocognitive
Neuroscience
Title E2ENNet: An end-to-end neural network for emotional brain-computer interface
URI https://www.proquest.com/docview/2701312811
https://www.proquest.com/docview/2707874167
https://pubmed.ncbi.nlm.nih.gov/PMC9413837
https://doaj.org/article/742c1557af0143ae9d917d231c065a2e
Volume 16
hasFullText 1
inHoldings 1
isFullTextHit
isPrint