Emotion recognition based on physiological changes in music listening

Bibliographic Details
Published in IEEE Transactions on Pattern Analysis and Machine Intelligence Vol. 30; no. 12; pp. 2067 - 2083
Main Authors Kim, Jonghwa; André, Elisabeth
Format Journal Article
Language English
Published Los Alamitos, CA IEEE 01.12.2008
IEEE Computer Society
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Abstract Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological data set to a feature-based multiclass classification. In order to collect a physiological data set from multiple subjects over many weeks, we used a musical induction method that spontaneously leads subjects to real emotional states, without any deliberate laboratory setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity, and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, and positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. An improved recognition accuracy of 95 percent and 70 percent for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.
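The EMDC scheme described in the abstract exploits the dichotomic structure of the 2D emotion model: instead of one direct four-class decision, it cascades binary decisions, classifying arousal first and then applying a valence classifier specialised to that arousal branch. The sketch below illustrates only the cascade structure; the feature names (`hr_mean`, `scl_slope`, `resp_rate`), the thresholds, and the threshold classifiers are hypothetical stand-ins for the paper's pLDA classifiers and extracted biosignal features.

```python
# Minimal sketch of emotion-specific multilevel dichotomous classification
# (EMDC): two binary stages over the 2D arousal/valence emotion model.
# All feature names and thresholds below are hypothetical illustrations,
# not the paper's actual pLDA classifiers or feature set.

def classify_arousal(features: dict) -> str:
    """Stage 1: high vs. low arousal (hypothetical heart-rate feature)."""
    return "high" if features["hr_mean"] > 75.0 else "low"

def classify_valence(features: dict, arousal: str) -> str:
    """Stage 2: valence, decided by a branch-specific (hypothetical) feature,
    so each arousal branch gets its own specialised classifier."""
    if arousal == "high":
        return "positive" if features["scl_slope"] > 0.0 else "negative"
    return "positive" if features["resp_rate"] > 14.0 else "negative"

def emdc(features: dict) -> str:
    """Cascade the two dichotomous stages into one of four emotion classes."""
    arousal = classify_arousal(features)
    valence = classify_valence(features, arousal)
    return f"{valence}/{arousal} arousal"

sample = {"hr_mean": 82.0, "scl_slope": 0.3, "resp_rate": 12.0}
print(emdc(sample))  # high-arousal branch, then positive valence
```

The design point is that each binary decision is easier than the joint four-class problem, and the second-stage classifier can use features relevant only within its arousal branch, which is what the paper reports as the source of EMDC's accuracy gain over direct multiclass pLDA.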
Author Jonghwa Kim
Andre, Elisabeth
Author_xml – sequence: 1
  surname: Jonghwa Kim
  fullname: Jonghwa Kim
  email: kim@informatik.uni-augsburg.de
  organization: Inst. fur Inf., Univ. of Augsburg, Augsburg, Germany
– sequence: 2
  givenname: Elisabeth
  surname: Andre
  fullname: Andre, Elisabeth
  email: andre@informatik.uni-augsburg.de
  organization: Inst. fur Inf., Univ. of Augsburg, Augsburg, Germany
BackLink http://pascal-francis.inist.fr/vibad/index.php?action=getRecordDetail&idt=20841939$$DView record in Pascal Francis
https://www.ncbi.nlm.nih.gov/pubmed/18988943$$D View this record in MEDLINE/PubMed
CODEN ITPIDJ
ContentType Journal Article
Copyright 2009 INIST-CNRS
Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2008
DOI 10.1109/TPAMI.2008.26
Discipline Engineering
Computer Science
Applied Sciences
Music
EISSN 2160-9292
1939-3539
EndPage 2083
ExternalDocumentID 2322675031
18988943
20841939
10_1109_TPAMI_2008_26
4441720
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
GroupedDBID ---
-DZ
-~X
.DC
0R~
29I
4.4
53G
5GY
5VS
6IK
97E
9M8
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABFSI
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
ACNCT
ADRHT
AENEX
AETEA
AETIX
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
E.L
EBS
EJD
F5P
FA8
HZ~
H~9
IBMZZ
ICLAB
IEDLZ
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
MS~
O9-
OCL
P2P
PQQKQ
RIA
RIE
RNI
RNS
RXW
RZB
TAE
TN5
UHB
VH1
XJT
~02
AAYOK
AAYXX
CITATION
RIG
IQODW
CGR
CUY
CVF
ECM
EIF
NPM
7SC
7SP
8FD
JQ2
L7M
L~C
L~D
F28
FR3
7X8
ID FETCH-LOGICAL-c569t-816884a1cfd10c9bcf35f8119a18721d6c950b27a71fd3f5fb64cdf0fc1487323
IEDL.DBID RIE
ISSN 0162-8828
IngestDate Fri Jul 11 08:29:26 EDT 2025
Thu Jul 10 22:49:53 EDT 2025
Thu Jul 10 22:08:50 EDT 2025
Fri Jul 11 00:03:39 EDT 2025
Sun Jun 29 12:46:27 EDT 2025
Mon Jul 21 06:02:21 EDT 2025
Mon Jul 21 09:15:07 EDT 2025
Thu Apr 24 23:03:24 EDT 2025
Sun Jul 06 05:07:15 EDT 2025
Wed Aug 27 02:47:51 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 12
Keywords synthesis
and processing
Human-centered computing
Feature evaluation and selection
User/Machine Systems
Theory and methods
Classifier design and evaluation
Robotics
Signal processing
Methodologies and techniques
Interaction styles
Signal analysis
Pattern analysis
Autonomous system
Time frequency domain method
Man machine system
affective computing
respiration
Modeling
electromyogram
Audiovisual
Induction method
electrocardiogram
User interface
Musical acoustics
Classification
Facies
Electrocardiography
Audiovisual equipment
Time analysis
Pattern extraction
physiological signal
Emotion recognition
Discriminant analysis
autonomic nervous system
skin conductance
biosignal
Frequency domain method
Emotion emotionality
valence
human-computer interaction
Multiscale method
Biosensor
Music
musical emotion
Skin
Automatic recognition
Artificial intelligence
Multiclass
Facial expression
arousal
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
CC BY 4.0
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c569t-816884a1cfd10c9bcf35f8119a18721d6c950b27a71fd3f5fb64cdf0fc1487323
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ObjectType-Article-2
ObjectType-Feature-1
OpenAccessLink https://opus.bibliothek.uni-augsburg.de/opus4/files/46592/46592.pdf
PMID 18988943
PQID 862251149
PQPubID 23500
PageCount 17
ParticipantIDs ieee_primary_4441720
proquest_miscellaneous_1671261690
crossref_primary_10_1109_TPAMI_2008_26
proquest_journals_862251149
proquest_miscellaneous_903629286
pubmed_primary_18988943
proquest_miscellaneous_34435618
proquest_miscellaneous_69761595
pascalfrancis_primary_20841939
crossref_citationtrail_10_1109_TPAMI_2008_26
PublicationCentury 2000
PublicationDate 2008-12-01
PublicationDateYYYYMMDD 2008-12-01
PublicationDate_xml – month: 12
  year: 2008
  text: 2008-12-01
  day: 01
PublicationDecade 2000
PublicationPlace Los Alamitos, CA
PublicationPlace_xml – name: Los Alamitos, CA
– name: United States
– name: New York
PublicationTitle IEEE transactions on pattern analysis and machine intelligence
PublicationTitleAbbrev TPAMI
PublicationTitleAlternate IEEE Trans Pattern Anal Mach Intell
PublicationYear 2008
Publisher IEEE
IEEE Computer Society
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: IEEE Computer Society
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
Guzzetta (ref17) 1989; 18
ref34
ref37
ref14
ref36
ref30
ref11
ref33
ref10
(ref23) 1995
ref32
ref2
ref39
ref16
ref38
ref19
ref18
Ferri (ref43) 1994
James (ref1) 1890
McIlroy (ref35) 1990
ref24
ref45
Kivy (ref12) 1989
ref26
ref25
ref47
ref20
ref42
LeDoux (ref29) 1992
ref41
ref22
ref44
ref21
Ekman (ref4) 1989
Melin (ref31) 1997; 11
ref28
ref27
Vaitl (ref15) 1993
ref8
ref7
ref9
Thuraisingham (ref40) 2005
ref3
ref5
Friedman (ref46) 1996
Cacioppo (ref6) 1993
References_xml – ident: ref33
  doi: 10.1109/34.824819
– start-page: 119
  volume-title: Handbook of Emotions
  year: 1993
  ident: ref6
  article-title: The Psychophysiology of Emotion
– volume: 11
  start-page: 238
  issue: 3
  year: 1997
  ident: ref31
  article-title: A Biopsychosocial Approach to Work-Stress and Musculoskeletal Disorders
  publication-title: J. Psychophysiology
– ident: ref5
  doi: 10.1037/0022-3514.62.6.972
– ident: ref24
  doi: 10.1007/BF02344719
– ident: ref11
  doi: 10.2307/40285496
– start-page: 339
  volume-title: The Amygdala: Neurobiological Aspects of Emotion, Memory, and Mental Dysfunction
  year: 1992
  ident: ref29
– ident: ref22
  doi: 10.1007/978-3-540-24842-2_4
– ident: ref47
  doi: 10.1002/0471660264
– ident: ref36
  doi: 10.1109/TBME.1985.325532
– ident: ref45
  doi: 10.1214/aos/1028144844
– ident: ref8
  doi: 10.1111/1469-8986.3820275
– ident: ref14
  doi: 10.7208/chicago/9780226521374.001.0001
– ident: ref10
  doi: 10.1111/j.1469-8986.1989.tb03163.x
– ident: ref44
  doi: 10.1109/34.531802
– ident: ref20
  doi: 10.1007/s10111-003-0143-x
– ident: ref39
  doi: 10.1103/PhysRevE.71.021906
– start-page: 143
  volume-title: Handbook of Social Psychophysiology
  year: 1989
  ident: ref4
  article-title: The Argument and Evidence about Universals in Facial Expressions of Emotion
– start-page: 403
  volume-title: Pattern Recognition in Practice IV: Multiple Paradigms, Comparative Studies, and Hybrid Systems
  year: 1994
  ident: ref43
  article-title: Comparative Study of Techniques for Large-Scale Feature Selection
– year: 2005
  ident: ref40
  article-title: On Multiscale Entropy Analysis for Physiological Data
  publication-title: Physica A
– ident: ref18
  doi: 10.1109/ICASSP.1998.679699
– ident: ref7
  doi: 10.1111/j.1469-8986.1990.tb02330.x
– volume-title: Sound Sentiment: An Essay on the Musical Emotions
  year: 1989
  ident: ref12
– ident: ref32
  doi: 10.1037/h0057490
– ident: ref30
  doi: 10.1016/j.brat.2003.08.006
– ident: ref21
  doi: 10.1080/02699939508408966
– volume: 18
  start-page: 609
  issue: 6
  year: 1989
  ident: ref17
  article-title: Effects of Relaxation and Music Therapy on Patients in a Coronary Care Unit with Presumptive Acute Myocardial Infarction
  publication-title: Heart and Lung: J. Critical Care
– ident: ref19
  doi: 10.1109/34.954607
– ident: ref26
  doi: 10.1016/0162-3095(90)90017-Z
– ident: ref41
  doi: 10.1152/physiologyonline.1999.14.3.111
– ident: ref16
  doi: 10.1093/jmt/26.4.168
– volume-title: Clinical Cardiology
  year: 1990
  ident: ref35
– ident: ref34
  doi: 10.1080/08839519508945477
– ident: ref2
  doi: 10.2307/1415404
– ident: ref42
  doi: 10.1109/34.574797
– ident: ref9
  doi: 10.1037/h0046234
– volume-title: Technical Report, Dept. of Statistics, Stanford Univ.
  year: 1996
  ident: ref46
  article-title: Another Approach to Polychotomous Classification
– ident: ref37
  doi: 10.1042/cs0910201
– ident: ref25
  doi: 10.1037/0003-066X.50.5.372
– ident: ref13
  doi: 10.1037/1196-1961.51.4.336
– ident: ref28
  doi: 10.1080/026999398379718
– ident: ref38
  doi: 10.1152/ajpheart.2000.278.6.h2039
– volume-title: The Principles of Psychology
  year: 1890
  ident: ref1
– ident: ref3
  doi: 10.1109/79.911197
– ident: ref27
  doi: 10.1037/h0054570
– start-page: 169
  volume-title: The Structure of Emotion: Psychophysiological, Cognitive, and Clinical Aspects
  year: 1993
  ident: ref15
  article-title: Prompts-Leitmotif-Emotion: Play It Again, Richard Wagner
– volume-title: The International Affective Picture System: Digitized Photographs
  year: 1995
  ident: ref23
SSID ssj0014503
Score 2.5217464
Snippet Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or...
In order to collect a physiological data set from multiple subjects over many weeks, we used a musical induction method that spontaneously leads subjects to...
SourceID proquest
pubmed
pascalfrancis
crossref
ieee
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 2067
SubjectTerms Adaptation, Physiological - physiology
Algorithms
Applied sciences
Arousal
Arousal - physiology
Artificial Intelligence
Auditory Perception - physiology
Biosensors
Channels
Classification
Classifier design and evaluation
Computer science; control theory; systems
Computer systems and distributed systems. User interface
Conductivity measurement
Discriminant analysis
Disk recording
Emotion recognition
Emotions
Emotions - physiology
Entropy
Exact sciences and technology
Feature evaluation and selection
Frequency
Human-centered computing
Humans
Intelligence
Interaction styles
Laboratories
Methodologies and techniques
Monitoring, Physiologic - methods
Music
Pattern analysis
Pattern Recognition, Automated - methods
Pattern recognition. Digital image processing. Computational geometry
Recognition
Robotics
Signal analysis, synthesis, and processing
Signal processing
Skin
Software
Speech
Studies
Theory and methods
User/Machine Systems
Title Emotion recognition based on physiological changes in music listening
URI https://ieeexplore.ieee.org/document/4441720
https://www.ncbi.nlm.nih.gov/pubmed/18988943
https://www.proquest.com/docview/862251149
https://www.proquest.com/docview/1671261690
https://www.proquest.com/docview/34435618
https://www.proquest.com/docview/69761595
https://www.proquest.com/docview/903629286
Volume 30
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE