EmotionMeter: A Multimodal Framework for Recognizing Human Emotions

Bibliographic Details
Published in IEEE Transactions on Cybernetics, Vol. 49, No. 3, pp. 1110-1122
Main Authors Zheng, Wei-Long; Liu, Wei; Lu, Yifei; Lu, Bao-Liang; Cichocki, Andrzej
Format Journal Article
Language English
Published United States IEEE 01.03.2019
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Abstract In this paper, we present a multimodal emotion recognition framework called EmotionMeter that combines brain waves and eye movements. To increase the feasibility and wearability of EmotionMeter in real-world applications, we design a six-electrode placement above the ears to collect electroencephalography (EEG) signals. We combine EEG and eye movements for integrating the internal cognitive states and external subconscious behaviors of users to improve the recognition accuracy of EmotionMeter. The experimental results demonstrate that modality fusion with multimodal deep neural networks can significantly enhance the performance compared with a single modality, and the best mean accuracy of 85.11% is achieved for four emotions (happy, sad, fear, and neutral). We explore the complementary characteristics of EEG and eye movements for their representational capacities and identify that EEG has the advantage of classifying happy emotion, whereas eye movements outperform EEG in recognizing fear emotion. To investigate the stability of EmotionMeter over time, each subject performs the experiments three times on different days. EmotionMeter obtains a mean recognition accuracy of 72.39% across sessions with the six-electrode EEG and eye movement features. These experimental results demonstrate the effectiveness of EmotionMeter within and between sessions.
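The abstract describes feature-level fusion of EEG and eye-movement features in a multimodal deep neural network for four-class emotion recognition. The Python sketch below is only an illustration of that general idea, not the authors' implementation; the feature dimensions (30 EEG differential-entropy values from six electrodes times five bands, 33 eye-movement statistics), layer sizes, and class ordering are assumptions made for the example.

```python
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    """Toy feature-level fusion: one encoder per modality, concatenate the
    latent codes, then classify into four emotions (happy, sad, fear, neutral)."""
    def __init__(self, eeg_dim=30, eye_dim=33, hidden=64, n_classes=4):
        super().__init__()
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, hidden), nn.ReLU())
        self.eye_enc = nn.Sequential(nn.Linear(eye_dim, hidden), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, eeg, eye):
        # Fuse the two modality representations by concatenation.
        z = torch.cat([self.eeg_enc(eeg), self.eye_enc(eye)], dim=1)
        return self.head(z)

if __name__ == "__main__":
    model = FusionNet()
    eeg = torch.randn(8, 30)   # hypothetical EEG differential-entropy features
    eye = torch.randn(8, 33)   # hypothetical pupil/fixation/saccade statistics
    logits = model(eeg, eye)
    print(logits.shape)        # torch.Size([8, 4])
```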
Author Zheng, Wei-Long
Lu, Yifei
Lu, Bao-Liang
Cichocki, Andrzej
Liu, Wei
Author_xml – sequence: 1
  givenname: Wei-Long
  orcidid: 0000-0002-9474-6369
  surname: Zheng
  fullname: Zheng, Wei-Long
  organization: Department of Computer Science and Engineering, Center for Brain-Like Computing and Machine Intelligence, Shanghai Jiao Tong University, Shanghai, China
– sequence: 2
  givenname: Wei
  orcidid: 0000-0002-3840-1980
  surname: Liu
  fullname: Liu, Wei
  organization: Department of Computer Science and Engineering, Center for Brain-Like Computing and Machine Intelligence, Shanghai Jiao Tong University, Shanghai, China
– sequence: 3
  givenname: Yifei
  orcidid: 0000-0001-9529-7780
  surname: Lu
  fullname: Lu, Yifei
  organization: Department of Computer Science and Engineering, Center for Brain-Like Computing and Machine Intelligence, Shanghai Jiao Tong University, Shanghai, China
– sequence: 4
  givenname: Bao-Liang
  orcidid: 0000-0001-8359-0058
  surname: Lu
  fullname: Lu, Bao-Liang
  email: blu@cs.sjtu.edu.cn
  organization: Department of Computer Science and Engineering, Center for Brain-Like Computing and Machine Intelligence, Shanghai Jiao Tong University, Shanghai, China
– sequence: 5
  givenname: Andrzej
  surname: Cichocki
  fullname: Cichocki, Andrzej
  organization: Nicolaus Copernicus University, Torun, Poland
BackLink https://www.ncbi.nlm.nih.gov/pubmed/29994384 (View this record in MEDLINE/PubMed)
CODEN ITCEB8
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019
DOI 10.1109/TCYB.2018.2797176
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005-present
IEEE All-Society Periodicals Package (ASPP) 1998-Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Mechanical & Transportation Engineering Abstracts
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
MEDLINE - Academic
Discipline Sciences (General)
EISSN 2168-2275
EndPage 1122
ExternalDocumentID 29994384
10_1109_TCYB_2018_2797176
8283814
Genre orig-research
Journal Article
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61673266
  funderid: 10.13039/501100001809
– fundername: Fundamental Research Funds for the Central Universities
– fundername: Ministry of Education and Science of the Russian Federation
  grantid: 14.756.31.0001
  funderid: 10.13039/501100003443
– fundername: Science and Technology Commission of Shanghai Municipality
  grantid: 15JC1400103
  funderid: 10.13039/501100003399
– fundername: Polish National Science Center
  grantid: 2016/20/W/N24/00354
– fundername: National Basic Research Program of China (973 Program); National Key Research and Development Program of China
  grantid: 2017YFB1002501
  funderid: 10.13039/501100012166
– fundername: Technology Research and Development Program of China Railway Corporation
  grantid: 2016Z003-B
– fundername: ZBYY-MOE Joint Funding
  grantid: 6141A02022604
ISSN 2168-2267
2168-2275
IsPeerReviewed true
IsScholarly true
Issue 3
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0002-9474-6369
0000-0001-8359-0058
0000-0002-3840-1980
0000-0001-9529-7780
PMID 29994384
PQID 2184578739
PQPubID 85422
PageCount 13
PublicationDate 2019-03-01
PublicationPlace United States
PublicationTitle IEEE transactions on cybernetics
PublicationTitleAbbrev TCYB
PublicationTitleAlternate IEEE Trans Cybern
PublicationYear 2019
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 1110
SubjectTerms Accuracy
Affective brain-computer interactions
Artificial neural networks
Biological neural networks
Brain
Brain modeling
deep learning
EEG
Electrodes
Electroencephalography
Emotion recognition
Emotions
Eye movements
Fear
Feature extraction
Human computer interaction
multimodal deep neural networks
Neural networks
Title EmotionMeter: A Multimodal Framework for Recognizing Human Emotions
URI https://ieeexplore.ieee.org/document/8283814
https://www.ncbi.nlm.nih.gov/pubmed/29994384
https://www.proquest.com/docview/2184578739
https://www.proquest.com/docview/2068339890
Volume 49