Compound facial expressions of emotion

Bibliographic Details
Published in Proceedings of the National Academy of Sciences - PNAS, Vol. 111; no. 15; pp. E1454-E1462
Main Authors Du, Shichuan; Tao, Yong; Martinez, Aleix M.
Format Journal Article
Language English
Published United States: National Academy of Sciences, 15.04.2014
Series PNAS Plus
Abstract Understanding the different categories of facial expressions of emotion regularly used by us is essential to gain insights into human cognition and affect as well as for the design of computational models and perceptual interfaces. Past research on facial expressions of emotion has focused on the study of six basic categories—happiness, surprise, anger, sadness, fear, and disgust. However, many more facial expressions of emotion exist and are used regularly by humans. This paper describes an important group of expressions, which we call compound emotion categories. Compound emotions are those that can be constructed by combining basic component categories to create new ones. For instance, happily surprised and angrily surprised are two distinct compound emotion categories. The present work defines 21 distinct emotion categories. Sample images of their facial expressions were collected from 230 human subjects. A Facial Action Coding System analysis shows the production of these 21 categories is different but consistent with the subordinate categories they represent (e.g., a happily surprised expression combines muscle movements observed in happiness and surprised). We show that these differences are sufficient to distinguish between the 21 defined categories. We then use a computational model of face perception to demonstrate that most of these categories are also visually discriminable from one another.
AbstractList Though people regularly recognize many distinct emotions, for the most part, research studies have been limited to six basic categories—happiness, surprise, sadness, anger, fear, and disgust; the reason for this is grounded in the assumption that only these six categories are differentially represented by our cognitive and social systems. The results reported herein propound otherwise, suggesting that a larger number of categories is used by humans.
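To make the combination rule concrete, here is a minimal Python sketch that pairs basic categories and merges their FACS action units (AUs). Everything in it is illustrative rather than taken from the paper: the AU sets are commonly cited FACS prototypes, not the paper's measured production data; the adverb table (including the filler form "surprisedly") is assumed purely for naming; and the loop enumerates all 30 ordered pairs, whereas the paper defines only 15 compound categories alongside the 6 basic ones.

from itertools import permutations

# Commonly cited FACS action-unit (AU) prototypes for the six basic
# categories (an approximation assumed for illustration, not the
# production tables reported in the paper).
BASIC_AUS = {
    "happy":     {6, 12},             # cheek raiser, lip corner puller
    "surprised": {1, 2, 5, 26},       # brow raisers, upper lid raiser, jaw drop
    "sad":       {1, 4, 15},
    "fearful":   {1, 2, 4, 5, 20, 26},
    "angry":     {4, 5, 7, 23},
    "disgusted": {9, 15, 16},
}

# Adverbial forms for naming compounds ("happily surprised", ...).
# "surprisedly" is a hypothetical filler; the paper does not use it.
ADVERB = {"happy": "happily", "sad": "sadly", "fearful": "fearfully",
          "angry": "angrily", "disgusted": "disgustedly",
          "surprised": "surprisedly"}

def compound(first, second):
    """Name a compound category and take the union of its constituents'
    AUs, mirroring the finding that a compound's production is
    consistent with the subordinate categories it combines."""
    return ADVERB[first] + " " + second, BASIC_AUS[first] | BASIC_AUS[second]

for a, b in permutations(BASIC_AUS, 2):
    name, aus = compound(a, b)
    print(f"{name:24s} AUs {sorted(aus)}")

For the abstract's own example, the sketch yields "happily surprised" with AUs {1, 2, 5, 6, 12, 26}, i.e., the happiness and surprise movements combined.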
Author Affiliations All three authors: Department of Electrical and Computer Engineering, and Center for Cognitive and Brain Sciences, The Ohio State University, Columbus, OH 43210
BackLink https://www.ncbi.nlm.nih.gov/pubmed/24706770 (view this record in MEDLINE/PubMed)
CitedBy 340 citing publications indexed in Crossref (identifier list omitted)
Copyright National Academy of Sciences, Apr 15, 2014
DOI 10.1073/pnas.1322355111
DatabaseName AGRIS; CrossRef; PubMed; MEDLINE (incl. Medline Complete, MEDLINE with Full Text, MEDLINE (Ovid), MEDLINE - Academic); PubMed Central; AGRICOLA (incl. AGRICOLA - Academic); Animal Behavior Abstracts; Bacteriology Abstracts (Microbiology B); Calcium & Calcified Tissue Abstracts; Chemoreception Abstracts; Ecology Abstracts; Entomology Abstracts; Immunology Abstracts; Neurosciences Abstracts; Nucleic Acids Abstracts; Oncogenes and Growth Factors Abstracts; Virology and AIDS Abstracts; Technology Research Database; Environmental Sciences and Pollution Management; Engineering Research Database; AIDS and Cancer Research Abstracts; Algology Mycology and Protozoology Abstracts (Microbiology C); Biotechnology and BioEngineering Abstracts; Genetics Abstracts
Discipline Sciences (General)
EISSN 1091-6490
EndPage E1462
ExternalDocumentID PMC3992629
3289823301
24706770
10_1073_pnas_1322355111
111_15_E1454
US201600143388
Genre Journal Article
Research Support, N.I.H., Extramural
Feature
GeographicLocations Ohio
GrantInformation NEI NIH HHS, grant R01 EY020834; NIDCD NIH HHS, grant R21 DC011081
ISSN 0027-8424
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 15
Keywords categorization
face recognition
action units
Language English
Notes Author contributions: A.M.M. designed research; S.D. and Y.T. performed research; S.D. and A.M.M. analyzed data; and S.D. and A.M.M. wrote the paper.
Edited by David J. Heeger, New York University, New York, NY, and approved February 28, 2014 (received for review December 1, 2013)
OpenAccessLink https://www.pnas.org/content/pnas/111/15/E1454.full.pdf
PMID 24706770
PQID 1520012900
PQPubID 42026
PublicationDate 2014-04-15
PublicationPlace Washington, United States
PublicationSeriesTitle PNAS Plus
PublicationTitle Proceedings of the National Academy of Sciences - PNAS
PublicationTitleAlternate Proc Natl Acad Sci U S A
PublicationYear 2014
Publisher National Academy of Sciences
StartPage E1454
SubjectTerms Adult
Biological Sciences
cognition
Cognition & reasoning
Discrimination (Psychology) - physiology
Emotions
Emotions - classification
Emotions - physiology
Face
Facial Expression
Facial Muscles - physiology
fearfulness
Female
Humans
Male
Models, Biological
muscles
Ohio
Photography
Physical Sciences
PNAS Plus
Sensory perception
Title Compound facial expressions of emotion
URI http://www.pnas.org/content/111/15/E1454.abstract
https://www.ncbi.nlm.nih.gov/pubmed/24706770
https://www.proquest.com/docview/1520012900
https://www.proquest.com/docview/1517398483
https://www.proquest.com/docview/1803099541
https://pubmed.ncbi.nlm.nih.gov/PMC3992629
Volume 111