Facial Expression and EEG Fusion for Investigating Continuous Emotions of Deaf Subjects

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 21, No. 15, pp. 16894-16903
Main Authors: Yang, Yi; Gao, Qiang; Song, Xiaolin; Song, Yu; Mao, Zemin; Liu, Junjie
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 1 August 2021
Subjects: Brain modeling; Clips; Continuous emotion recognition; deaf; EEG; Electroencephalography; Emotion recognition; Emotions; facial expression; Feature extraction; Human-computer interface; Motion pictures; Psychology; Sensors; Time-frequency analysis

Abstract: Emotion recognition has received increasing attention in human-computer interaction (HCI) and psychological assessment. Compared with single-modality emotion recognition, the multimodal paradigm performs better because it introduces complementary information. However, current research has focused mainly on hearing subjects, while deaf subjects also need to understand emotional changes in real life. In this paper, we propose a multimodal continuous emotion recognition method for deaf subjects based on facial expressions and electroencephalograph (EEG) signals. Twelve emotional movie clips were selected as stimuli and annotated by ten postgraduates majoring in psychology. The EEG signals and facial expressions of deaf subjects were recorded while they watched the stimulus clips. Differential entropy (DE) features of the EEG were extracted by time-frequency analysis, and six facial features were extracted from facial landmarks. Long short-term memory (LSTM) networks were used to perform feature-level fusion and to capture the temporal dynamics of emotions. The results show that, for deaf subjects, EEG signals capture the dynamic changes of emotion better than facial expressions in continuous emotion recognition. Multimodal fusion compensates for the limitations of each single modality and achieves better performance. Finally, the neural activities of deaf subjects reveal that the prefrontal lobe region may be strongly related to negative emotion processing, and the lateral temporal lobe region may be strongly related to positive emotion processing.
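The DE extraction described in the abstract can be illustrated with a minimal Python sketch, assuming Gaussian-distributed band-filtered signals, for which DE has the closed form 0.5*ln(2*pi*e*sigma^2). The band edges, sampling rate, and filter order below are assumptions for illustration, not values taken from the paper.

import numpy as np
from scipy.signal import butter, filtfilt

# Assumed EEG band edges in Hz; the paper's exact bands may differ.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def de_features(eeg, fs=200):
    """eeg: (n_channels, n_samples) -> (n_channels, n_bands) DE features."""
    feats = []
    for low, high in BANDS.values():
        # 4th-order Butterworth band-pass with zero-phase filtering
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        band = filtfilt(b, a, eeg, axis=1)
        var = band.var(axis=1)  # per-channel variance of the band signal
        # DE of a Gaussian signal: 0.5 * ln(2 * pi * e * sigma^2)
        feats.append(0.5 * np.log(2 * np.pi * np.e * var))
    return np.stack(feats, axis=1)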
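Likewise, a minimal sketch of the landmark features and the LSTM feature-level fusion. The details are assumptions: dlib-style 68-point landmark indices, the particular six geometric features, the hidden size, and a one-dimensional continuous output are all guesses, and PyTorch is used only for illustration.

import torch
import torch.nn as nn

def six_facial_features(pts):
    """pts: (68, 2) landmark array -> 6 geometric features (assumed set)."""
    d = lambda i, j: float(((pts[i] - pts[j]) ** 2).sum() ** 0.5)
    return torch.tensor([
        d(62, 66),  # inner mouth opening
        d(48, 54),  # mouth width
        d(37, 41),  # left eye opening
        d(43, 47),  # right eye opening
        d(19, 37),  # left brow-to-eye distance
        d(24, 44),  # right brow-to-eye distance
    ])

class FusionLSTM(nn.Module):
    """Feature-level fusion of EEG DE features and facial features."""
    def __init__(self, eeg_dim, face_dim=6, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(eeg_dim + face_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # continuous emotion (e.g., valence)

    def forward(self, eeg_seq, face_seq):
        # eeg_seq: (batch, T, eeg_dim); face_seq: (batch, T, 6)
        x = torch.cat([eeg_seq, face_seq], dim=-1)  # concatenate per window
        h, _ = self.lstm(x)
        return self.head(h)  # (batch, T, 1) continuous prediction

At inference, per-window DE features and facial features are concatenated along the feature axis, so the LSTM sees one fused sequence per trial and emits a continuous emotion estimate per time step.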
Authors and Affiliations
– Yang, Yi (ORCID 0000-0001-8679-9359; email: yyflying@yeah.net). Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems & School of Electrical and Electronic Engineering, Tianjin University of Technology, Tianjin, China.
– Gao, Qiang. Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems, TUT Maritime College, Tianjin University of Technology, Tianjin, China.
– Song, Xiaolin (email: songxiaolin16@hotmail.com). Engineering Training Center, Tianjin University of Technology, Tianjin, China.
– Song, Yu (ORCID 0000-0002-9295-7795; email: jasonsongrain@hotmail.com). Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems & School of Electrical and Electronic Engineering, Tianjin University of Technology, Tianjin, China.
– Mao, Zemin. Technical College for the Deaf, Tianjin University of Technology, Tianjin, China.
– Liu, Junjie (ORCID 0000-0002-8827-1141). Tianjin Key Laboratory for Control Theory and Applications in Complicated Systems & School of Electrical and Electronic Engineering, Tianjin University of Technology, Tianjin, China.
CODEN: ISJEAZ
Copyright: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
DOI: 10.1109/JSEN.2021.3078087
Discipline: Geography; Engineering; Psychology
EISSN: 1558-1748
Genre: Original research
Funding
– Fundamental Research on Advanced Technology and Engineering Application Team, Tianjin, China (Grant 20160524)
– Natural Science Foundation of Tianjin (Grant 18JCYBJC87700)
ISSN: 1530-437X
Peer Reviewed: Yes
Scholarly: Yes
License: https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html; https://doi.org/10.15223/policy-029; https://doi.org/10.15223/policy-037
URI: https://ieeexplore.ieee.org/document/9424606; https://www.proquest.com/docview/2556491042