Developing an EEG-Based Emotion Recognition Using Ensemble Deep Learning Methods and Fusion of Brain Effective Connectivity Maps
Published in | IEEE Access, Vol. 12, pp. 50949-50965 |
Main Authors | Bagherzadeh, Sara; Shalbaf, Ahmad; Shoeibi, Afshin; Jafari, Mahboobeh; Tan, Ru-San; Acharya, U. Rajendra |
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024 |
Subjects | Accuracy; Arousal; Artificial neural networks; Brain modeling; Datasets; Deep learning; Effective connectivity; Electroencephalography; Emotion recognition; Emotional factors; Emotions; Feature extraction; Human-computer interface; Image classification; Information flow; Long short term memory; Machine learning; Medical imaging; Time-frequency analysis; Transfer functions; Transfer learning; Two dimensional models |
Abstract | The objective of this paper is to develop a novel emotion recognition system from electroencephalogram (EEG) signals using effective connectivity and deep learning methods. Emotion recognition is an important task for various applications such as human-computer interaction and mental health diagnosis. The paper aims to improve the accuracy and robustness of emotion recognition by combining different effective connectivity (EC) methods with pre-trained convolutional neural networks (CNNs) and long short-term memory (LSTM) networks. EC methods measure information flow in the brain during emotional states using EEG signals. We used three EC methods: transfer entropy (TE), partial directed coherence (PDC), and direct directed transfer function (dDTF), and estimated a fused image from these methods for each five-second window of 32-channel EEG signals. We then applied six pre-trained CNNs to classify the images into four emotion classes based on the two-dimensional valence-arousal model, using the leave-one-subject-out cross-validation strategy to evaluate the classification results. We also used an ensemble model that combines the outputs of the best pre-trained CNNs by majority voting, and we combined the CNNs with LSTM to further improve recognition performance. We achieved an average accuracy and F-score of 98.76% and 98.86% on the DEAP dataset, and 98.66% and 98.88% on the MAHNOB-HCI dataset, respectively. Our results show that fused images increase accuracy and that an ensemble and a combination of pre-trained CNNs and LSTM can achieve high accuracy for automated emotion recognition. Our model outperformed other state-of-the-art systems on the same datasets for four-class emotion classification. |
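Note: the abstract describes the pipeline only in prose. The short Python sketch below illustrates two of its steps on synthetic data: stacking the TE, PDC, and dDTF connectivity maps of one five-second, 32-channel window into a single three-channel image, and combining the class labels of several pre-trained CNNs by majority voting. All function names, the min-max normalization, and the random inputs are assumptions for illustration; they are not taken from the authors' implementation.

```python
# Illustrative sketch only: fuse three effective-connectivity (EC) maps into one
# RGB-like image and apply a majority-voting ensemble over per-model predictions.
# Assumes 32-channel EEG and four valence-arousal classes, as in the paper.
import numpy as np

N_CHANNELS = 32          # EEG channels -> each EC map is 32 x 32
N_CLASSES = 4            # quadrants of the valence-arousal plane


def normalize(ec_map: np.ndarray) -> np.ndarray:
    """Scale a connectivity matrix to [0, 1] so the three maps are comparable."""
    lo, hi = ec_map.min(), ec_map.max()
    return (ec_map - lo) / (hi - lo + 1e-12)


def fuse_ec_maps(te: np.ndarray, pdc: np.ndarray, ddtf: np.ndarray) -> np.ndarray:
    """Stack TE, PDC, and dDTF maps as three image channels, shape (H, W, 3)."""
    return np.stack([normalize(te), normalize(pdc), normalize(ddtf)], axis=-1)


def majority_vote(predictions: np.ndarray) -> np.ndarray:
    """Combine class labels from several CNNs; input shape (n_models, n_samples)."""
    n_models, n_samples = predictions.shape
    fused = np.empty(n_samples, dtype=int)
    for i in range(n_samples):
        fused[i] = np.bincount(predictions[:, i], minlength=N_CLASSES).argmax()
    return fused


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # One 5-second window: three synthetic 32x32 EC maps stand in for TE/PDC/dDTF.
    te, pdc, ddtf = (rng.random((N_CHANNELS, N_CHANNELS)) for _ in range(3))
    image = fuse_ec_maps(te, pdc, ddtf)
    print("fused image shape:", image.shape)        # (32, 32, 3)

    # Labels predicted by three hypothetical pre-trained CNNs for 5 windows.
    cnn_labels = rng.integers(0, N_CLASSES, size=(3, 5))
    print("ensemble labels:", majority_vote(cnn_labels))
```

In the paper's pipeline, such fused images would be fed to the six pre-trained CNNs (and to the CNN-LSTM variant) under leave-one-subject-out cross-validation, with the majority vote taken over the best-performing models.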
Author Details | Sara Bagherzadeh (ORCID 0000-0003-2980-8866), Department of Biomedical Engineering, Islamic Azad University Science and Research Branch, Tehran, Iran; Ahmad Shalbaf (ORCID 0000-0002-1595-7281; shalbaf@sbmu.ac.ir), Department of Biomedical Engineering and Medical Physics, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran; Afshin Shoeibi (ORCID 0000-0003-0635-6799), Data Science and Computational Intelligence Institute, University of Granada, Granada, Spain; Mahboobeh Jafari (ORCID 0000-0001-7964-4033), School of Mathematics, Physics and Computing, University of Southern Queensland, Toowoomba, QLD, Australia; Ru-San Tan (ORCID 0000-0003-2086-6517), National Heart Centre Singapore, Hospital Drive, Singapore; U. Rajendra Acharya, School of Mathematics, Physics and Computing, University of Southern Queensland, Toowoomba, QLD, Australia |
CODEN | IAECCG |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DOI | 10.1109/ACCESS.2024.3384303 |
Discipline | Engineering |
EISSN | 2169-3536 |
Grant Information | Shahid Beheshti University of Medical Sciences, grant 43004477 (funder ID 10.13039/501100005851) |
ISSN | 2169-3536 |
License | https://creativecommons.org/licenses/by-nc-nd/4.0 |
OpenAccessLink | https://doaj.org/article/a7fcd399d5854be093f30e78d5f81eba |
URI | https://ieeexplore.ieee.org/document/10488403 https://www.proquest.com/docview/3040053839 https://doaj.org/article/a7fcd399d5854be093f30e78d5f81eba |