MLAR-Net: A Multilevel Attention-Based ResNet Module for the Automated Recognition of Emotions Using Single-Channel EEG Signals
Published in | IEEE Access Vol. 13; pp. 99122-99144 |
---|---|
Main Authors | Maithri, M.; Raghavendra, U.; Gudigar, Anjan; Kumar Praharaj, Samir; Sriram, Karthikeyan; Salvi, Massimo; Hong Yeong, Chai; Molinari, Filippo; Rajendra Acharya, U. |
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025 |
Abstract | Human emotion recognition is important as it finds applications in multiple domains such as medicine, entertainment, and the military. However, accurately identifying emotions remains challenging because humans can hide or suppress their emotional expressions. Hence, it becomes important to recognize emotions using brain signals, as they provide more reliable data. Brain signals can be captured using electroencephalogram (EEG) electrodes. Most commonly used EEG devices come with multiple channels; however, not all channel information is important for emotion recognition. Another issue with existing datasets is the small number of available samples. To address these challenges, we propose MLAR-Net, a novel multilevel attention module for emotion recognition using single-channel EEG signals. Our approach converts EEG signals into spectrograms using multiple parameters to generate a large set of images. This data is then processed through the proposed MLAR-Net, which integrates a multilevel attention module with the ResNet18 architecture. Our study identifies channel number 24 (T7) as the most effective for emotion classification, achieving an average accuracy of 98.06% using a cubic support vector machine and a maximum accuracy of 99.51% using fine K-Nearest Neighbors. The study was conducted on the SEED dataset, a publicly available dataset developed by capturing EEG signals from fifteen subjects for three classes of emotions: positive, negative, and neutral. The results show an improvement of around 4 to 5% over state-of-the-art studies using the same channel, surpassing existing methods for single-channel EEG-based emotion recognition. Furthermore, we highlight the top-performing channels that can be used for real-time implementation of the system with a minimum number of channels. |
---|---|
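The abstract's core preprocessing step, converting a single-channel EEG segment into several spectrogram images under different parameter settings to enlarge the training set, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the 200 Hz sampling rate, the Hann window, the window lengths, and the helper names `stft_spectrogram` / `eeg_to_spectrograms` are all assumptions.

```python
import numpy as np

def stft_spectrogram(x, nperseg, noverlap):
    """Magnitude-squared STFT of a 1-D signal using a Hann window."""
    step = nperseg - noverlap
    n_frames = 1 + (len(x) - nperseg) // step
    win = np.hanning(nperseg)
    frames = np.stack([x[i * step: i * step + nperseg] * win
                       for i in range(n_frames)])
    # rows = time frames, columns = frequency bins
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

def eeg_to_spectrograms(x, window_lengths=(64, 128, 256)):
    """One log-scaled spectrogram 'image' per STFT window length,
    mimicking the multi-parameter augmentation described in the
    abstract (the specific window lengths are illustrative)."""
    return [np.log1p(stft_spectrogram(x, n, n // 2).T)  # freq x time
            for n in window_lengths]

# Example: 4 s of synthetic alpha-band (10 Hz) activity plus noise,
# standing in for one single-channel EEG segment sampled at 200 Hz.
fs = 200
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) \
    + 0.5 * np.random.default_rng(0).standard_normal(t.size)
specs = eeg_to_spectrograms(x)
print([s.shape for s in specs])  # three images at different resolutions
```

Each window length trades time resolution for frequency resolution, so one EEG segment yields several distinct "images" for the CNN.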
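The abstract says MLAR-Net couples a multilevel attention module with a ResNet18 backbone but does not define the module in this record. The single-level, squeeze-and-excitation-style channel attention below is therefore only a generic sketch of the idea; the weights, shapes, and reduction ratio are chosen arbitrarily for illustration.

```python
import numpy as np

def channel_attention(feature_map, w1, w2):
    """Reweight the channels of a CNN feature map of shape (C, H, W):
    global-average-pool each channel, pass the descriptors through a
    small two-layer bottleneck, and scale channels by sigmoid gates."""
    squeeze = feature_map.mean(axis=(1, 2))        # (C,) channel descriptors
    hidden = np.maximum(0.0, w1 @ squeeze)         # ReLU, reduced dimension
    gates = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # per-channel gates in (0, 1)
    return feature_map * gates[:, None, None]

rng = np.random.default_rng(0)
fmap = rng.standard_normal((8, 4, 4))   # e.g. one ResNet stage's output
w1 = 0.1 * rng.standard_normal((2, 8))  # reduction ratio 4 (8 -> 2)
w2 = 0.1 * rng.standard_normal((8, 2))
out = channel_attention(fmap, w1, w2)
print(out.shape)  # (8, 4, 4)
```

A multilevel variant would apply such a block after several ResNet stages and fuse the results; how MLAR-Net does this is specified only in the full paper.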
Author | Salvi, Massimo; Maithri, M.; Molinari, Filippo; Sriram, Karthikeyan; Kumar Praharaj, Samir; Raghavendra, U.; Hong Yeong, Chai; Gudigar, Anjan; Rajendra Acharya, U. |
Author_xml |
1. Maithri, M. (ORCID 0000-0002-4550-6836), Department of Mechatronics, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, India
2. Raghavendra, U. (ORCID 0000-0002-1124-089X; email: raghavendra.u@manipal.edu), Department of Instrumentation and Control Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, Karnataka, India
3. Gudigar, Anjan (ORCID 0000-0001-5634-9103), Department of Instrumentation and Control Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, Karnataka, India
4. Kumar Praharaj, Samir, Department of Psychiatry, Kasturba Medical College, Manipal, Manipal Academy of Higher Education, Manipal, Karnataka, India
5. Sriram, Karthikeyan, Department of Electrical and Electronics Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, Karnataka, India
6. Salvi, Massimo (ORCID 0000-0001-7225-7401), Department of Electronics and Telecommunications, Politecnico di Torino, Biolab, PoliToBIOMed Laboratory, Turin, Italy
7. Hong Yeong, Chai (ORCID 0000-0003-1572-4143), School of Medicine, Faculty of Health and Medical Sciences, Taylor's University, Subang Jaya, Malaysia
8. Molinari, Filippo (ORCID 0000-0003-1150-2244), Department of Electronics and Telecommunications, Politecnico di Torino, Biolab, PoliToBIOMed Laboratory, Turin, Italy
9. Rajendra Acharya, U., School of Mathematics, Physics and Computing, University of Southern Queensland, Springfield, Australia |
CODEN | IAECCG |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025 |
DOI | 10.1109/ACCESS.2025.3576059 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE Xplore Open Access Journals IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Electronic Library (IEL) - NZ CrossRef Computer and Information Systems Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts METADEX Technology Research Database Materials Research Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional DOAJ Directory of Open Access Journals |
DatabaseTitle | CrossRef Materials Research Database Engineered Materials Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Advanced Technologies Database with Aerospace METADEX Computer and Information Systems Abstracts Professional |
DatabaseTitleList | Materials Research Database |
Discipline | Engineering |
EISSN | 2169-3536 |
EndPage | 99144 |
ExternalDocumentID | oai_doaj_org_article_0bfc5f3f1d184eb7b1d76930512d95fa 10_1109_ACCESS_2025_3576059 11021444 |
Genre | orig-research |
ISSN | 2169-3536 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
LinkModel | DirectLink |
ORCID | 0000-0001-7225-7401 0000-0001-5634-9103 0000-0002-1124-089X 0000-0002-4550-6836 0000-0003-1572-4143 0000-0003-1150-2244 |
OpenAccessLink | https://ieeexplore.ieee.org/document/11021444 |
PQID | 3218513522 |
PQPubID | 4845423 |
PageCount | 23 |
ParticipantIDs | doaj_primary_oai_doaj_org_article_0bfc5f3f1d184eb7b1d76930512d95fa crossref_primary_10_1109_ACCESS_2025_3576059 proquest_journals_3218513522 ieee_primary_11021444 |
PublicationCentury | 2000 |
PublicationDate | 2025 |
PublicationDateYYYYMMDD | 2025-01-01 |
PublicationDecade | 2020 |
PublicationPlace | Piscataway |
PublicationTitle | IEEE Access |
PublicationTitleAbbrev | Access |
PublicationYear | 2025 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
SourceID | doaj proquest crossref ieee |
SourceType | Open Website Aggregation Database Index Database Publisher |
StartPage | 99122 |
SubjectTerms | Accuracy; attention-based ResNet; Availability; Biomedical imaging; Brain; Brain modeling; Channels; Computational modeling; Datasets; Electrodes; Electroencephalography; Emotion recognition; Emotions; explainable AI; Feature extraction; Modules; Multilevel; Real time; Real-time systems; single-channel EEG; Spectrogram; Spectrograms; Support vector machines |
Title | MLAR-Net: A Multilevel Attention-Based ResNet Module for the Automated Recognition of Emotions Using Single-Channel EEG Signals |
URI | https://ieeexplore.ieee.org/document/11021444 https://www.proquest.com/docview/3218513522 https://doaj.org/article/0bfc5f3f1d184eb7b1d76930512d95fa |
Volume | 13 |