Image-Evoked Emotion Recognition for Hearing-Impaired Subjects with EEG Signals

Bibliographic Details
Published in Sensors (Basel, Switzerland) Vol. 23; no. 12; p. 5461
Main Authors Zhu, Mu; Jin, Haonan; Bai, Zhongli; Li, Zhiwei; Song, Yu
Format Journal Article
Language English
Published Switzerland MDPI AG 01.06.2023
MDPI
Subjects
Online Access Get full text

Abstract In recent years, there has been growing interest in the study of emotion recognition from electroencephalogram (EEG) signals. One group of particular interest is individuals with hearing impairments, who may be biased towards certain types of information when communicating with those in their environment. To address this, our study collected EEG signals from both hearing-impaired and non-hearing-impaired subjects while they viewed pictures of emotional faces. Four kinds of feature matrices (symmetry difference and symmetry quotient, computed from both the original signal and its differential entropy (DE)) were constructed to extract spatial-domain information. A multi-axis self-attention classification model was proposed, consisting of local attention and global attention, and combining attention with convolution through a novel architectural element for feature classification. Three-class (positive, neutral, negative) and five-class (happy, neutral, sad, angry, fearful) emotion recognition tasks were carried out. The experimental results show that the proposed method outperforms the original feature method, and that multi-feature fusion performed well for both hearing-impaired and non-hearing-impaired subjects. The average classification accuracy was 70.2% (three-class) and 50.15% (five-class) for hearing-impaired subjects, and 72.05% (three-class) and 51.53% (five-class) for non-hearing-impaired subjects. In addition, by exploring the brain topography of different emotions, we found that the discriminative brain regions of the hearing-impaired subjects were also distributed in the parietal lobe, unlike those of the non-hearing-impaired subjects.
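The abstract names differential entropy (DE) plus symmetry difference and symmetry quotient features over paired electrodes. Purely as an illustrative sketch (not the authors' implementation; the function names and toy data below are invented here), the standard closed-form DE of an approximately Gaussian band-limited channel and the left/right symmetry features can be computed as:

```python
import numpy as np

def differential_entropy(x):
    # DE of an (approximately) Gaussian, band-limited signal:
    # DE = 0.5 * log(2 * pi * e * var(x))
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def symmetry_features(de_left, de_right):
    # Symmetry difference and symmetry quotient over left/right
    # symmetric electrode pairs (often called DASM and RASM).
    return de_left - de_right, de_left / de_right

# Toy data: 4 symmetric electrode pairs, 256 samples per channel.
rng = np.random.default_rng(0)
left = rng.normal(size=(4, 256))
right = rng.normal(size=(4, 256))

de_l = np.array([differential_entropy(ch) for ch in left])
de_r = np.array([differential_entropy(ch) for ch in right])
dasm, rasm = symmetry_features(de_l, de_r)
print(dasm.shape, rasm.shape)  # (4,) (4,)
```

In the paper these per-pair values would be arranged into spatial feature matrices; here they are simply returned as vectors.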
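The record does not reproduce the multi-axis self-attention model itself. As a hedged, minimal sketch of the local-plus-global attention idea only (identity Q/K/V projections, a made-up averaging fusion, no learned weights, no convolution branch), the two attention scopes can be illustrated as:

```python
import numpy as np

def softmax(s, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(s - s.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Single-head attention with identity Q/K/V projections,
    # kept deliberately minimal: scores = x x^T / sqrt(d).
    d = x.shape[-1]
    return softmax(x @ x.T / np.sqrt(d)) @ x

def local_global_attention(x, window=4):
    # Local branch: attention inside non-overlapping windows.
    # Global branch: attention over the whole sequence.
    # Averaging the two branches is a stand-in for the model's
    # fusion step; no learned parameters are involved.
    n, _ = x.shape
    local = np.vstack([self_attention(x[i:i + window])
                       for i in range(0, n, window)])
    return 0.5 * (local + self_attention(x))

x = np.arange(32, dtype=float).reshape(8, 4)  # 8 "tokens", 4 features
out = local_global_attention(x, window=4)
print(out.shape)  # (8, 4)
```

Each output row is an average of two convex combinations of input rows, one restricted to its window and one over all rows, which is the basic intuition behind mixing local and global attention.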
Audience Academic
Author Li, Zhiwei
Bai, Zhongli
Zhu, Mu
Jin, Haonan
Song, Yu
AuthorAffiliation Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems, School of Electrical Engineering and Automation, Tianjin University of Technology, Tianjin 300384, China; zm78792021@163.com (M.Z.); tianjinjhn1231@hotmail.com (H.J.); zl.bai@hotmail.com (Z.B.)
AuthorAffiliation_xml – name: Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems, School of Electrical Engineering and Automation, Tianjin University of Technology, Tianjin 300384, China; zm78792021@163.com (M.Z.); tianjinjhn1231@hotmail.com (H.J.); zl.bai@hotmail.com (Z.B.)
Author_xml – sequence: 1
  givenname: Mu
  surname: Zhu
  fullname: Zhu, Mu
– sequence: 2
  givenname: Haonan
  surname: Jin
  fullname: Jin, Haonan
– sequence: 3
  givenname: Zhongli
  surname: Bai
  fullname: Bai, Zhongli
– sequence: 4
  givenname: Zhiwei
  surname: Li
  fullname: Li, Zhiwei
– sequence: 5
  givenname: Yu
  orcidid: 0000-0002-9295-7795
  surname: Song
  fullname: Song, Yu
BackLink https://www.ncbi.nlm.nih.gov/pubmed/37420628 (View this record in MEDLINE/PubMed)
CitedBy_id crossref_primary_10_1016_j_apacoust_2023_109620
crossref_primary_10_31083_j_jin2311210
ContentType Journal Article
Copyright COPYRIGHT 2023 MDPI AG
2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
2023 by the authors. 2023
DOI 10.3390/s23125461
DatabaseName CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
ProQuest Central (Corporate)
Health & Medical Collection
ProQuest Central (purchase pre-March 2016)
Medical Database (Alumni Edition)
Hospital Premium Collection
Hospital Premium Collection (Alumni Edition)
ProQuest Central (Alumni) (purchase pre-March 2016)
ProQuest Central (Alumni)
ProQuest Central UK/Ireland
ProQuest Central Essentials Local Electronic Collection Information
ProQuest Central
ProQuest One Community College
ProQuest Central Korea
Health Research Premium Collection
Health Research Premium Collection (Alumni)
ProQuest Health & Medical Complete (Alumni)
Health & Medical Collection (Alumni Edition)
Medical Database
ProQuest Central Premium
ProQuest One Academic (New)
Publicly Available Content Database
ProQuest Health & Medical Research Collection
ProQuest One Academic Middle East (New)
ProQuest One Health & Nursing
ProQuest One Academic Eastern Edition
ProQuest One Academic
ProQuest One Academic UKI Edition
ProQuest Central China
MEDLINE - Academic
PubMed Central (Full Participant titles)
DOAJ Directory of Open Access Journals
Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 3
  dbid: EIF
  name: MEDLINE
  url: https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 4
  dbid: BENPR
  name: ProQuest Central
  url: https://www.proquest.com/central
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID oai_doaj_org_article_6c294ab48d244ed0980e90487904818e
PMC10301379
A758482873
37420628
10_3390_s23125461
Genre Journal Article
GeographicLocations China
GeographicLocations_xml – name: China
GrantInformation_xml – fundername: Tianjin University of Technology Graduate Program
  grantid: YJ2226
– fundername: National Natural Science Foundation of China
  grantid: 62103299
ISSN 1424-8220
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 12
Keywords self-attention mechanism
EEG signals
emotion faces
emotion classification
hearing-impaired subjects
Language English
License https://creativecommons.org/licenses/by/4.0
Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
These authors contributed equally to this work.
ORCID 0000-0002-9295-7795
OpenAccessLink https://www.proquest.com/docview/2829876248?pq-origsite=%requestingapplication%
PMID 37420628
PQID 2829876248
PQPubID 2032333
PublicationCentury 2000
PublicationDate 2023-06-01
PublicationDateYYYYMMDD 2023-06-01
PublicationDate_xml – month: 06
  year: 2023
  text: 2023-06-01
  day: 01
PublicationDecade 2020
PublicationPlace Switzerland
PublicationPlace_xml – name: Switzerland
– name: Basel
PublicationTitle Sensors (Basel, Switzerland)
PublicationTitleAlternate Sensors (Basel)
PublicationYear 2023
Publisher MDPI AG
MDPI
Publisher_xml – name: MDPI AG
– name: MDPI
References Hou (ref_2) 2022; 201
Yang (ref_4) 2022; 26
Dorman (ref_5) 2020; 24
Maithri (ref_12) 2022; 215
ref_13
ref_11
Tian (ref_22) 2021; 70
Tian (ref_31) 2022; 27
ref_18
ref_17
Christensen (ref_3) 2019; 40
McFarland (ref_20) 2017; 14
Wang (ref_15) 2020; 2
Zhang (ref_27) 2021; 9
Bai (ref_6) 2023; 152
Li (ref_14) 2018; 10
Gong (ref_24) 2011; 25
Yadava (ref_1) 2017; 76
Wu (ref_26) 2021; 6
Schmidt (ref_23) 2021; 58
Song (ref_19) 2018; 11
Wang (ref_7) 2023; 23
Tandle (ref_21) 2018; 5
ref_29
ref_28
Mert (ref_16) 2018; 21
Mowla (ref_10) 2020; 126
Menezes (ref_9) 2019; 10
Zheng (ref_8) 2017; 9
Yang (ref_30) 2021; 21
Zheng (ref_25) 2015; 7
References_xml – ident: ref_28
– volume: 11
  start-page: 532
  year: 2018
  ident: ref_19
  article-title: EEG emotion recognition using dynamical graph convolutional neural networks
  publication-title: IEEE Trans. Affect. Comput.
  doi: 10.1109/TAFFC.2018.2817622
– volume: 10
  start-page: 3955
  year: 2019
  ident: ref_9
  article-title: Affective recognition from EEG signals: An integrated data-mining approach
  publication-title: J. Ambient Intell. Humaniz. Comput.
  doi: 10.1007/s12652-018-1065-z
– volume: 9
  start-page: 281
  year: 2017
  ident: ref_8
  article-title: Multichannel EEG-Based Emotion Recognition via Group Sparse Canonical Correlation Analysis
  publication-title: IEEE Trans. Cogn. Dev. Syst.
  doi: 10.1109/TCDS.2016.2587290
– volume: 126
  start-page: 104001
  year: 2020
  ident: ref_10
  article-title: Affective brain-computer interfaces: Choosing a meaningful performance measuring metric
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2020.104001
– volume: 215
  start-page: 106646
  year: 2022
  ident: ref_12
  article-title: Automated emotion recognition: Current trends and future perspectives
  publication-title: Comput. Methods Programs Biomed.
  doi: 10.1016/j.cmpb.2022.106646
– ident: ref_11
  doi: 10.1007/978-3-319-70096-0_73
– volume: 58
  start-page: e13781
  year: 2021
  ident: ref_23
  article-title: The human mirror neuron system—A common neural basis for social cognition
  publication-title: Psychophysiology
  doi: 10.1111/psyp.13781
– volume: 25
  start-page: 40
  year: 2011
  ident: ref_24
  article-title: Revision of the Chinese facial affective picture system
  publication-title: Chin. Ment. Health J.
– volume: 76
  start-page: 19087
  year: 2017
  ident: ref_1
  article-title: Analysis of EEG signals and its application to neuromarketing
  publication-title: Multimed. Tools. Appl.
  doi: 10.1007/s11042-017-4580-6
– volume: 10
  start-page: 368
  year: 2018
  ident: ref_14
  article-title: Hierarchical convolutional neural networks for EEG-based emotion recognition
  publication-title: Cogn. Comput.
  doi: 10.1007/s12559-017-9533-x
– volume: 26
  start-page: 589
  year: 2022
  ident: ref_4
  article-title: Investigating of Deaf Emotion Cognition Pattern by EEG and Facial Expression Combination
  publication-title: IEEE J. Biomed. Health Inform.
  doi: 10.1109/JBHI.2021.3092412
– volume: 24
  start-page: 2331216520920079
  year: 2020
  ident: ref_5
  article-title: Approximations to the Voice of a Cochlear Implant: Explorations with Single-Sided Deaf Listeners
  publication-title: Trends Hear.
  doi: 10.1177/2331216520920079
– volume: 23
  start-page: 5165
  year: 2023
  ident: ref_7
  article-title: EEG-Based Emotion Identification Using 1-D Deep Residual Shrinkage Network with Microstate Features
  publication-title: IEEE Sens. J.
  doi: 10.1109/JSEN.2023.3239507
– ident: ref_13
  doi: 10.1109/NER.2013.6695876
– ident: ref_29
  doi: 10.1109/CVPR.2017.195
– volume: 21
  start-page: 81
  year: 2018
  ident: ref_16
  article-title: Emotion recognition from EEG signals by using multivariate empirical mode decomposition
  publication-title: Pattern Anal. Appl.
  doi: 10.1007/s10044-016-0567-6
– volume: 70
  start-page: 2516911
  year: 2021
  ident: ref_22
  article-title: EEG-Based Emotion Recognition of Deaf Subjects by Integrated Genetic Firefly Algorithm
  publication-title: IEEE Trans. Instrum. Meas.
  doi: 10.1109/TIM.2021.3121473
– ident: ref_17
  doi: 10.1109/SPIN.2015.7095376
– volume: 14
  start-page: 016009
  year: 2017
  ident: ref_20
  article-title: Prediction of subjective ratings of emotional pictures by EEG features
  publication-title: J. Neural Eng.
  doi: 10.1088/1741-2552/14/1/016009
– volume: 40
  start-page: 1069
  year: 2019
  ident: ref_3
  article-title: Effects of age and hearing loss on the recognition of emotions in speech
  publication-title: Ear Hear.
  doi: 10.1097/AUD.0000000000000694
– volume: 9
  start-page: 7943
  year: 2021
  ident: ref_27
  article-title: Multimodal emotion recognition using a hierarchical fusion convolutional neural network
  publication-title: IEEE Access
  doi: 10.1109/ACCESS.2021.3049516
– volume: 27
  start-page: 363
  year: 2022
  ident: ref_31
  article-title: A Novel Domain Adversarial Networks Based on 3D-LSTM and Local Domain Discriminator for Hearing-Impaired Emotion Recognition
  publication-title: IEEE J. Biomed. Health Inform.
  doi: 10.1109/JBHI.2022.3212475
– ident: ref_18
  doi: 10.3390/s20072034
– volume: 6
  start-page: 93
  year: 2021
  ident: ref_26
  article-title: Data processing method of noise logging based on cubic spline interpolation
  publication-title: Appl. Math. Nonlinear Sci.
  doi: 10.2478/amns.2021.1.00014
– volume: 152
  start-page: 106344
  year: 2023
  ident: ref_6
  article-title: Emotion recognition with residual network driven by spatial-frequency characteristics of EEG recorded from hearing-impaired adults in response to video clips
  publication-title: Comput. Biol. Med.
  doi: 10.1016/j.compbiomed.2022.106344
– volume: 2
  start-page: 121
  year: 2020
  ident: ref_15
  article-title: Emotion recognition using WT-SVM in human-computer interaction
  publication-title: J. New Media
  doi: 10.32604/jnm.2020.010674
– volume: 5
  start-page: 14
  year: 2018
  ident: ref_21
  article-title: Mental state and emotion detection from musically stimulated EEG
  publication-title: Brain Inform.
  doi: 10.1186/s40708-018-0092-z
– volume: 21
  start-page: 16894
  year: 2021
  ident: ref_30
  article-title: Facial Expression and EEG Fusion for Investigating Continuous Emotions of Deaf Subjects
  publication-title: IEEE Sens. J.
  doi: 10.1109/JSEN.2021.3078087
– volume: 7
  start-page: 162
  year: 2015
  ident: ref_25
  article-title: Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks
  publication-title: IEEE Trans. Auton. Ment. Dev.
  doi: 10.1109/TAMD.2015.2431497
– volume: 201
  start-page: 111724
  year: 2022
  ident: ref_2
  article-title: Deep feature pyramid network for EEG emotion recognition
  publication-title: Measurement
  doi: 10.1016/j.measurement.2022.111724
RelatedPersons Song Yu
RelatedPersons_xml – fullname: Song Yu
SourceID doaj
pubmedcentral
proquest
gale
pubmed
crossref
SourceType Open Website
Open Access Repository
Aggregation Database
Index Database
Enrichment Source
StartPage 5461
SubjectTerms Analysis
Artificial intelligence
Brain - physiology
Brain research
Classification
College students
Colleges & universities
Datasets
EEG signals
Eigenvalues
Electroencephalography
Electroencephalography - methods
emotion classification
emotion faces
Emotions
Emotions - physiology
Experiments
Fear
Fourier transforms
Hearing loss
hearing-impaired subjects
Humans
Machine learning
Neural networks
Recognition, Psychology
self-attention mechanism
Song Yu
Support vector machines
Title Image-Evoked Emotion Recognition for Hearing-Impaired Subjects with EEG Signals
URI https://www.ncbi.nlm.nih.gov/pubmed/37420628
https://www.proquest.com/docview/2829876248
https://www.proquest.com/docview/2835278602
https://pubmed.ncbi.nlm.nih.gov/PMC10301379
https://doaj.org/article/6c294ab48d244ed0980e90487904818e
Volume 23
hasFullText 1
inHoldings 1
linkProvider Scholars Portal