Emotion Recognition of Subjects with Hearing Impairment Based on Fusion of Facial Expression and EEG Topographic Map
Published in | IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 31, p. 1 |
Main Authors | Li, Dahua; Liu, Jiayin; Yang, Yi; Hou, Fazheng; Song, Haotian; Song, Yu; Gao, Qiang; Mao, Zemin |
Format | Journal Article |
Language | English |
Published | United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023 |
ISSN | 1534-4320; 1558-0210 |
DOI | 10.1109/TNSRE.2022.3225948 |
Abstract | Emotion analysis has been employed in many fields such as human-computer interaction, rehabilitation, and neuroscience, but most emotion analysis methods focus on healthy controls or patients with depression. This paper aims to classify the emotional expressions of individuals with hearing impairment based on EEG signals and facial expressions. The two kinds of signals were collected simultaneously while the subjects watched affective video clips, and the video clips were labeled with discrete emotional states (fear, happiness, calmness, and sadness). We extracted differential entropy (DE) features from the EEG signals and converted the DE features into EEG topographic maps (ETM). Next, the ETM and facial expressions were fused by a multichannel fusion method. Finally, a deep learning classifier, CBAM_ResNet34, combining a Residual Network (ResNet) with the Convolutional Block Attention Module (CBAM), was used for subject-dependent emotion classification. The results show that the average classification accuracy over the four emotions after multimodal fusion reaches 78.32%, higher than the 67.90% obtained with facial expressions alone and the 69.43% obtained with EEG signals alone. Moreover, Gradient-weighted Class Activation Mapping (Grad-CAM) visualization of the ETM showed that the prefrontal, temporal, and occipital lobes were the brain regions most closely related to emotional changes in individuals with hearing impairment. |
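For reference, the differential entropy (DE) feature named in the abstract has a standard closed form in the EEG emotion literature when each band-pass-filtered channel signal is modeled as Gaussian; the record does not state the authors' exact definition, so the following is the conventional formulation:

$$h(X) = -\int_{-\infty}^{\infty} p(x)\,\ln p(x)\,dx = \frac{1}{2}\ln\!\left(2\pi e\sigma^{2}\right), \qquad X \sim \mathcal{N}(\mu,\sigma^{2})$$

Under this assumption, the DE of each channel and frequency band reduces to a function of the signal variance, which is why it can be computed cheaply per electrode and then mapped onto the scalp.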
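To make the ETM construction concrete, here is a minimal sketch that interpolates per-electrode DE values onto an image grid. It assumes 2-D electrode coordinates and uses NumPy/SciPy; the 32-electrode layout, the 32x32 map size, and the helper names (de_feature, eeg_topographic_map) are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: per-electrode DE values -> EEG topographic map (ETM).
# Electrode layout, map size, and function names are illustrative assumptions.
import numpy as np
from scipy.interpolate import griddata

def de_feature(signal):
    """Differential entropy of a band-filtered signal under a Gaussian assumption."""
    return 0.5 * np.log(2.0 * np.pi * np.e * np.var(signal))

def eeg_topographic_map(de_values, electrode_xy, size=32):
    """Interpolate per-electrode DE values onto a size x size image grid."""
    xs = np.linspace(electrode_xy[:, 0].min(), electrode_xy[:, 0].max(), size)
    ys = np.linspace(electrode_xy[:, 1].min(), electrode_xy[:, 1].max(), size)
    grid_x, grid_y = np.meshgrid(xs, ys)
    # Cubic interpolation inside the electrode hull; zero-fill outside it.
    etm = griddata(electrode_xy, de_values, (grid_x, grid_y),
                   method="cubic", fill_value=0.0)
    return etm.astype(np.float32)

# Example: 32 electrodes with random 2-D positions and synthetic band signals.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(32, 2))
de = np.array([de_feature(rng.standard_normal(256)) for _ in range(32)])
etm = eeg_topographic_map(de, xy)  # one (32, 32) map per frequency band
```

One such map per frequency band can then be stacked with the facial-expression image channels, which is one plausible reading of the multichannel fusion the abstract describes.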
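Likewise, a minimal sketch of a CBAM_ResNet34-style classifier, assuming PyTorch/torchvision. Where the CBAM blocks are attached, the 6-channel fused input (3 facial-image channels plus 3 ETM channels), and the 4-class head are assumptions for illustration; the record does not describe the authors' exact architecture.

```python
# Hypothetical sketch of a CBAM_ResNet34 classifier; input width, CBAM
# placement, and the 4-class head are assumptions, not the paper's design.
import torch
import torch.nn as nn
from torchvision.models import resnet34

class ChannelAttention(nn.Module):
    """CBAM channel attention: shared MLP over avg- and max-pooled descriptors."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        scale = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        return x * scale.view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    """CBAM spatial attention: 7x7 conv over channel-pooled avg/max maps."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(pooled))

class CBAM(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ca, self.sa = ChannelAttention(channels), SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))

class CBAMResNet34(nn.Module):
    """ResNet-34 backbone with a CBAM block appended after each residual stage."""
    def __init__(self, in_channels=6, num_classes=4):
        super().__init__()
        base = resnet34(weights=None)
        # Widen the stem so it accepts the multichannel face+ETM fusion input.
        base.conv1 = nn.Conv2d(in_channels, 64, kernel_size=7, stride=2,
                               padding=3, bias=False)
        base.fc = nn.Linear(base.fc.in_features, num_classes)
        # Append a CBAM block to each stage (output widths 64/128/256/512).
        for name, ch in [("layer1", 64), ("layer2", 128),
                         ("layer3", 256), ("layer4", 512)]:
            setattr(base, name, nn.Sequential(getattr(base, name), CBAM(ch)))
        self.net = base

    def forward(self, x):
        return self.net(x)

# Example: a batch of 8 fused inputs (3 facial + 3 ETM channels) at 224x224.
logits = CBAMResNet34()(torch.randn(8, 6, 224, 224))  # shape: (8, 4)
```

For simplicity this sketch appends CBAM after each residual stage, whereas the original CBAM paper inserts the module inside every residual block; either is only a guess at the authors' placement.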
Author | Mao, Zemin; Li, Dahua; Hou, Fazheng; Liu, Jiayin; Song, Haotian; Gao, Qiang; Yang, Yi; Song, Yu |
Author_xml | – sequence 1: Li, Dahua (ORCID 0000-0002-1710-3036); School of Electrical Engineering and Automation, Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems, Tianjin University of Technology, Tianjin, China
– sequence 2: Liu, Jiayin; same affiliation as sequence 1
– sequence 3: Yang, Yi (ORCID 0000-0001-8679-9359); Department of Electrical and Computer Engineering, Faculty of Science and Technology, University of Macau, Macau, China
– sequence 4: Hou, Fazheng (ORCID 0000-0002-9296-4450); same affiliation as sequence 1
– sequence 5: Song, Haotian; same affiliation as sequence 1
– sequence 6: Song, Yu (ORCID 0000-0002-9295-7795); same affiliation as sequence 1
– sequence 7: Gao, Qiang (ORCID 0000-0001-9357-4967); TUT Maritime College, Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems, Tianjin University of Technology, Tianjin, China
– sequence 8: Mao, Zemin; Technical College for the Deaf, Tianjin University of Technology, Tianjin, China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/36455076 (View this record in MEDLINE/PubMed) |
CODEN | ITNSB3 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
DOI | 10.1109/TNSRE.2022.3225948 |
Discipline | Occupational Therapy & Rehabilitation |
EISSN | 1558-0210 |
EndPage | 1 |
Genre | orig-research Research Support, Non-U.S. Gov't Journal Article |
GrantInformation_xml | – 2021 Tianjin Postgraduate Research and Innovation Project (Grant 2021YJSS092) – National Natural Science Foundation of China (Grant 62103299; funder ID 10.13039/501100001809) |
ISSN | 1534-4320; 1558-0210 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
ORCID | 0000-0001-9357-4967 0000-0002-1710-3036 0000-0001-8679-9359 0000-0002-9295-7795 0000-0002-9296-4450 0000-0002-8884-7710 |
OpenAccessLink | https://doaj.org/article/9067de0f56354a29a3f76e3f162c823d |
PMID | 36455076 |
PageCount | 1 |
PublicationDate | 2023-01-01 |
PublicationPlace | United States (New York) |
PublicationTitle | IEEE transactions on neural systems and rehabilitation engineering |
PublicationTitleAbbrev | TNSRE |
PublicationTitleAlternate | IEEE Trans Neural Syst Rehabil Eng |
PublicationYear | 2023 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1 |
SubjectTerms | Brain; Classification; Clips; Deep learning; EEG; Electroencephalogram topographic map; Electroencephalography; Electroencephalography - methods; Emotion recognition; Emotional factors; Emotions; Emotions - physiology; Entropy; Facial Expression; Feature extraction; Geologic depressions; Hearing; Hearing Loss; Humans; Impairment; Individuals with hearing impairment; Machine learning; Nervous system; Occipital lobes; Rehabilitation; Topographic mapping; Topographic maps; Topography |
Title | Emotion Recognition of Subjects with Hearing Impairment Based on Fusion of Facial Expression and EEG Topographic Map |
URI | https://ieeexplore.ieee.org/document/9968039; https://www.ncbi.nlm.nih.gov/pubmed/36455076; https://www.proquest.com/docview/2771532627; https://www.proquest.com/docview/2746392609; https://doaj.org/article/9067de0f56354a29a3f76e3f162c823d |
Volume | 31 |