Survey of Emotions in Human–Robot Interactions: Perspectives from Robotic Psychology on 20 Years of Research
Published in | International Journal of Social Robotics, Vol. 14, No. 2, pp. 389–411
Main Author | Stock-Homburg, Ruth
Format | Journal Article
Language | English
Published | Dordrecht: Springer Netherlands (Springer Nature B.V.), 01.03.2022
ISSN | 1875-4791
EISSN | 1875-4805
DOI | 10.1007/s12369-021-00778-6
Abstract | Knowledge production within the interdisciplinary field of human–robot interaction (HRI) with social robots has accelerated, despite the continued fragmentation of the research domain. Together, these features make it hard to remain at the forefront of research or to assess the collective evidence pertaining to specific areas, such as the role of emotions in HRI. This systematic review of state-of-the-art research into humans’ recognition of, and responses to, artificial emotions of social robots during HRI encompasses the years 2000–2020. In accordance with a stimulus–organism–response framework, the review advances robotic psychology by revealing current knowledge about (1) the generation of artificial robotic emotions (stimulus), (2) human recognition of artificial robotic emotions (organism), and (3) human responses to robotic emotions (response), as well as (4) other contingencies that affect emotions as moderators.
Author | Stock-Homburg, Ruth (ORCID: 0000-0002-8576-5883; rsh@bwl.tu-darmstadt.de; Professor for Marketing and HR, Technical University of Darmstadt)
Copyright | The Author(s) 2021. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Discipline | Engineering; Psychology
GrantInformation | Deutsche Forschungsgemeinschaft (funder ID: http://dx.doi.org/10.13039/501100001659)
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | Artificial emotions; Survey; Review; Human–robot interaction (HRI); Social robots; Emotions
OpenAccessLink | https://link.springer.com/10.1007/s12369-021-00778-6 |
PageCount | 23 |
PublicationTitleAbbrev | Int J of Soc Robotics |
In: Proceedings of the first international conference on autonomous agents, pp 148–155 – reference: LevittSDListJAWhat do laboratory experiments measuring social preferences reveal about the real world?J Econ Perspect2007212153174 – reference: TsiourtiCWeissAWacKVinczeMMultimodal integration of emotional signals from voice, body, and context: effects of (in) congruence on emotion recognition and attitudes towards robotsInt J Soc Robot2019114555573 – reference: DubalSFoucherAJouventRNadelJHuman brain spots emotion in non humanoid robotsSoc Cognit Affect Neurosci2011619097 – reference: BishopLvan MarisADogramadziSZookNSocial robots: the influence of human and robot characteristics on acceptancePaladyn J Behav Robot2019101346358 – reference: Ekman P, Friesen W (1978) Facial action coding system: a technique for the measurement of facial movement, consulting psychologists press. Palo Alto – reference: KimEHHyunKHKimSHKwakYKImproved emotion recognition with a novel speaker-independent featureIEEE/ASME Trans Mechatron2009143317325 – reference: NishioSTauraKSumiokaHIshiguroHTeleoperated android robot as emotion regulation mediaInt J Soc Robot201354563573 – reference: ZhangTKaberDBZhuBSwangnetrMMosalyPHodgeLService robot feature design effects on user perceptions and emotional responsesIntel Serv Robot2010327388 – reference: GhaniDAIshakSBARelationship between the art of wayang kulit and disney’s twelve principles of animationRev Res Soc Interv201237162179 – reference: Greco A, Roberto A, Saggese A, Vento M, Vigilante V (2019) Emotion analysis from faces for social robotics. In: 2019 IEEE international conference on systems, man and cybernetics (SMC). IEEE, pp 358–364 – reference: Huang JY, Lee WP, Dong BW (2019) Learning emotion recognition and response generation for a service robot. In: IFToMM international symposium on robotics and mechatronics. Springer, pp 286–297 – reference: Forlizzi J (2007) How robotic products become social products: an ethnographic study of cleaning in the home. In: 2007 2nd ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 129–136 – reference: Baraka K, Alves-Oliveira P, Ribeiro T (2019) An extended framework for characterizing social robots. arXiv preprint arXiv:1907.09873 – reference: TeradaKTakeuchiCEmotional expression in simple line drawings of a robot’s face leads to higher offers in the ultimatum gameFront Psychol2017872419 – reference: SongKTHanMJWangSCSpeech signal-based emotion recognition and its application to entertainment robotsJ Chin Inst Eng20143711425 – reference: Danev L, Hamann M, Fricke N, Hollarek T, Paillacho D (2017) Development of animated facial expressions to express emotions in a robot: roboticon. In: 2017 IEEE second ecuador technical chapters meeting (ETCM). IEEE, pp 1–6 – reference: Ekman P (2005) Handbook of cognition and emotion, chap. basic emotions – reference: Alonso-MartinFMalfazMSequeiraJGorostizaJFSalichsMAA multimodal emotion detection system during human–robot interactionSensors201313111554915581 – reference: DengJPangGZhangZPangZYangHYangGcGAN based facial expression recognition for human–robot interactionIEEE Access2019798489859 – reference: Park CH, Javed H, Jeon M (2019) Consensus-based human–agent interaction model for emotion regulation in ASD. In: International conference on human–computer interaction. Springer, pp 295–301 – reference: Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. 
In: 2012 IEEE RO-MAN: The 21st IEEE international symposium on robot and human interactive communication. IEEE, pp 314–321 – reference: Valenti A, Chita-Tegmark M, Gold M, Law T, Scheutz M (2019) In their own words: a companion robot for detecting the emotional state of persons with Parkinson’s disease. In: International conference on social robotics. Springer, pp 443–452 – reference: Koschate M, Potter R, Bremner P, Levine M (2016) Overcoming the uncanny valley: displays of emotions reduce the uncanniness of humanlike robots. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 359–366 – reference: WangSLilienfeldSORochatPThe uncanny valley: existence and explanationsRev Gen Psychol2015194393407 – reference: Maeda Y, Geshi S (2018) Human–robot interaction using Markovian emotional model based on facial recognition. In: 2018 Joint 10th international conference on soft computing and intelligent systems (SCIS) and 19th international symposium on advanced intelligent systems (ISIS). IEEE, pp 209–214 – reference: Stock RM, Merkle M (2017) A service robot acceptance model: User acceptance of humanoid robots during service encounters. In: 2017 IEEE international conference on pervasive computing and communications workshops (PerCom Workshops). IEEE, pp 339–344 – reference: ThompsonLFGillanDJBarnesMJentschFSocial factors in human-robot interactionHuman-robot interactions in future military operations2016SurreyAshgate6781 – reference: Obaid M, Kuchenbrandt D, Bartneck C (2014) Empathy and yawn contagion: can we (humans) catch yawns from robots? In: Proceedings of the 2014 ACM/IEEE international conference on human-robot interaction, pp 260–261 – reference: ŻarkowskiMMulti-party turn-taking in repeated human–robot interactions: an interdisciplinary evaluationInt J Soc Robot2019115693707 – reference: Stock R, Nguyen MA (2019) Robotic psychology. what do we know about human–robot interaction and what do we still need to learn? In: Proceedings of the 52nd Hawaii international conference on system sciences, pp 1936–1945 – reference: Itoh K, Miwa H, Matsumoto M, Zecca M, Takanobu H, Roccella S, Carrozza MC, Dario P, Takanishi A (2004) Various emotional expressions with emotion expression humanoid robot WE-4RII. In: IEEE conference on robotics and automation, 2004. TExCRA technical exhibition based. IEEE, pp 35–36 – reference: BagozziRPPrinciples of marketing management1986ChicagoScience Research Associates – reference: Gockley R, Simmons R, Forlizzi J (2006) Modeling affect in socially interactive robots. In: ROMAN 2006-The 15th IEEE international symposium on robot and human interactive communication. IEEE, pp 558–563 – reference: Kwon OW, Chan K, Hao J, Lee TW (2003) Emotion recognition by speech signals. In: Eurospeech, Geneva, pp 125–128 – reference: Chao-gang W, Jie-yu Z, Yuan-yuan Z (2008) An emotion generation model for interactive virtual robots. In: 2008 international symposium on computational intelligence and design, vol 2. IEEE, pp 238–241 – reference: GácsiMKisAFaragóTJaniakMMuszyńskiRMiklósiÁHumans attribute emotions to a robot that shows simple behavioural patterns borrowed from dog behaviourComput Hum Behav201659411419 – reference: Ganster T, Eimler SC, Von Der Pütten A, Hoffmann L, Krämer NC, von der Pütten A (2010) Methodological considerations for long-term experience with robots and agents – reference: Prasad V, Stock-Homburg R, Peters J (2021) Human–robot handshaking: a review. Int J Soc Robot in press. 
https://doi.org/10.1007/s12369-021-00763-2 – reference: Müller NH, Truschzinski M (2014) An emotional framework for a real-life worker simulation. In: International conference on human–computer interaction. Springer, pp 675–686 – reference: OrtonyACloreGCollinsAThe cognitive structure of emotions1988New YorkCambridge University Press – reference: Hochschild AR (2012) The managed heart: commercialization of human feeling. Univ of California Press – reference: KimJHKimBGRoyPPJeongDMEfficient facial expression recognition algorithm based on hierarchical deep neural network structureIEEE Access201974127341285 – reference: GoulartCValadãoCDelisle-RodriguezDFunayamaDFavaratoABaldoGBinotteVCaldeiraEBastos-FilhoTVisual and thermal image processing for facial specific landmark detection to infer emotions in a child–robot interactionSensors201919132844 – reference: BrownSAVenkateshVGoyalSExpectation confirmation in information systems researchMIS Q2014383729756 – reference: LevittSDListJAField experiments in economics: the past, the present, and the futureEur Econ Rev2009531118 – reference: Deshmukh A, Babu SK, Unnikrishnan R, Ramesh S, Anitha P, Bhavani RR (2019)Influencing hand-washing behaviour with a social robot: Hri study with school children in rural India. In: 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN). IEEE, pp 1–6 – reference: HeerinkMKröseBEversVWielingaBThe influence of social presence on acceptance of a companion robot by older peopleJ Phys Agents2008223340 – reference: BennettCCŠabanovićSDeriving minimal features for human-like facial expressions in robotic facesInt J Soc Robot201463367381 – reference: JimenezFYoshikawaTFuruhashiTKanohMAn emotional expression model for educational-support robotsJ Artif Intell Soft Comput Res2015515157 – reference: YangJWangRGuanXHassanMMAlmogrenAAlsanadAAi-enabled emotion-aware robot: the fusion of smart clothing, edge clouds and roboticsFutur Gener Comput Syst2020102701709 – reference: Aly A, Tapus A (2013) A model for synthesizing a combined verbal and nonverbal behavior based on personality traits in human–robot interaction. In: 2013 8th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 325–332 – reference: Nunes ARV (2019) Deep emotion recognition through upper body movements and facial expression, student report spring semester, Aalborg University – reference: Leite I, McCoy M, Lohani M, Ullman D, Salomons N, Stokes C, Rivers S, Scassellati BEmotional (2015)storytelling in the classroom: individual versus group interaction between children and robots. In: 2015 10th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 75–82 – reference: Itoh K, Miwa H, Zecca M, Takanobu H, Roccella S, Carrozza MC, Dario P, Takanishi A(2006) Mechanical design of emotion expression humanoid robot WE-4RII. In: Romansy 16. Springer, pp 255–262 – reference: YamashitaYIshiharaHIkedaTAsadaMInvestigation of causal relationship between touch sensations of robots and personality impressions by path analysisInt J Soc Robot2019111141150 – reference: Zhang J, Xiao N (2020) Capsule network-based facial expression recognition method for a humanoid robot. In: Recent Trends in Intelligent Computing, Communication and Devices, pp. 113–121. Springer – reference: Hu Y, Hoffman G (2019) Using skin texture change to design emotion expression in social robots. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). 
IEEE, pp 2–10 (2019) – reference: LittlejohnSWFossKATheories of human communication2010Long GroveWaveland Press – reference: Inthiam J, Hayashi E, Jitviriya W, Mowshowitz A (2019) Mood estimation for human-robot interaction based on facial and bodily expression using a hidden Markov model. In: 2019 IEEE/SICE international symposium on system integration (SII). IEEE, pp 352–356 – reference: Becker-AsanoCIshiguroHIntercultural differences in decoding facial expressions of the android robot geminoid fJ Artif Intell Soft Comput Res201113215231 – reference: Fischer K, Jung M, Jensen LC, aus der Wieschen MV (2019) Emotion expression in HRI–when and why. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 29–38 – reference: de Graaf MM, Allouch SB, Van Dijk J (2015) What makes robots social? A user’s perspective on characteristics for social human–robot interaction. In: International conference on social robotics. Springer, pp 184–193 – reference: Pandya H, Patel H (2019) Facial affect detection using transfer learning: a comparative study, PsyArXiv Preprints, pp 1–5 – reference: SuYLiWBiNLvZAdolescents environmental emotion perception by integrating EEG and eye movementsFront Neurorobot20191346 – reference: Hegel F, Spexard T, Wrede B, Horstmann G, Vogt T (2006) Playing a different imitation game: interaction with an empathic android robot. In: 2006 6th IEEE-RAS international conference on humanoid robots IEEE, pp 56–61 – reference: Bryant D (2019) Towards emotional intelligence in social robots designed for children. In: Proceedings of the 2019 AAAI/ACM conference on AI, ethics, and society, pp 547–548 – reference: Hegel, F, Eyssel F, Wrede B (2010) The social robot ‘flobi’: key concepts of industrial design. In: 19th international symposium in robot and human interactive communication. IEEE, pp 107–112 – reference: Kurono Y, Sripian P, Chen F, Sugaya M (2019) A preliminary experiment on the estimation of emotion using facial expression and biological signals. In: International conference on human–computer interaction. Springer, pp 133–142 – reference: ZimmermanDWComparative power of student t test and Mann–Whitney u test for unequal sample sizes and variancesJ Exp Educ1987553171174 – reference: Leyzberg D, Avrunin E, Liu J, Scassellati B (2011) Robots that express emotion elicit better human teaching. In: 2011 6th ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 347–354 – reference: DemoulinSLeyensJPPaladinoMPRodriguez-TorresRRodriguez-PerezADovidioJDimensions of “uniquely” and “non-uniquely” human emotionsCognit Emot20041817196 – reference: TranfieldDDenyerDSmartPTowards a methodology for developing evidence-informed management knowledge by means of systematic reviewBr J Manag2003143207222 – reference: Sugaya, M (2019) Emotion aware robot by emotion estimation using biological sensors. In: 2019 IEEE international conference on pervasive computing and communications workshops (PerCom Workshops). IEEE, p 541 – reference: Zhu C, Ahmad W (2019) Emotion recognition from speech to improve human–robot interaction. In: 2019 IEEE international conference on dependable, autonomic and secure computing, international conference on pervasive intelligence and computing, international conference on cloud and big data computing, international conference on cyber science and technology congress (DASC/PiCom/CBDCom/CyberSciTech). 
IEEE, pp 370–375 – reference: ParkJSKimJHOhYHFeature vector classification based speech emotion recognition for service robotsIEEE Trans Consum Electron200955315901596 – reference: Bien ZZ, Kim JB, Kim DJ, Han JS, Do JH (2002) Soft computing based emotion/intention reading for service robot. In: AFSS international conference on fuzzy systems. Springer, pp 121–128 – reference: Lopez-Rincon A(2019) Emotion recognition using facial expressions in children using the nao robot. In: 2019 international conference on electronics, communications and computers (CONIELECOMP). IEEE, pp 146–153 – reference: Lisetti CL, Marpaung A (2005) A three-layered architecture for socially intelligent agents: modeling the multilevel process of emotions. In: International conference on affective computing and intelligent interaction. Springer, pp 956–963 – reference: Tsuchiya S, Imono M, Watabe H (2015) Judging emotion from EEGS, Procedia computer science, 60, pp 37–44 – reference: Sabelli AM, Kanda T, Hagita N (2011) A conversational robot in an elderly care center: an ethnographic study. In: 2011 6th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 37–44 – reference: LeventhalHSchererKThe relationship of emotion to cognition: a functional approach to a semantic controversyCogn Emot198711328 – reference: Castillo JC, Castro-González Á, Alonso-Martín F, Fernández-Caballero A, Salichs MÁ (2018) Emotion detection and regulation from personal assistant robot in smart environment. In: Personal assistants: emerging computational technologies. Springer, pp 179–195 – reference: Fong T, Thorpe C, Baur C (2003) Collaboration, dialogue, human-robot interaction. In: 10th international symposium on robotics research . Tracts in Advanced Robotics, vol 6. Springer, pp 255–266 – reference: ErogluSAMachleitKADavisLMAtmospheric qualities of online retailing: a conceptual model and implicationsJ Bus Res2001542177184 – reference: Yu C, Tapus A (2019) Interactive robot learning for multimodal emotion recognition. In: International conference on social robotics. Springer, pp 633–642 – reference: Michaud F, Robichaud E, Audet J (2001) Using motives and artificial emotions for prolonged activity of a group of autonomous robots. In: Proceedings of the AAAI fall symposium on emotions. Cape Code Massachussetts – reference: Bera A, Randhavane T, Manocha D (2019) Modelling multi-channel emotions using facial expression and trajectory cues for improving socially-aware robot navigation. In: Proceedings of the IEEE conference on computer vision and pattern recognition workshops – reference: Chen C, Garrod OG, Zhan J, Beskow J, Schyns PG, Jack R E (2018) Reverse engineering psychologically valid facial expressions of emotion into social robots. In: 2018 13th IEEE international conference on automatic face & gesture recognition (FG 2018). IEEE, pp 448–452 – reference: Tahon M, Delaborde A, Devillers L (2011) Real-life emotion 1762 detection from speech in human–robot interaction: experiments 1763 across diverse corpora with child and adult voices. 
In: Cosi P, De Mori R, Di Fabbrizio G, Pieraccini R (eds) Interspeech 2011, 12th annual conference of the international speech communication association, August 27–31, pp 3121–3124 – reference: WirtzJPattersonPGKunzWHGruberTLuVNPaluchSMartinsABrave new world: service robots in the frontlineJ Serv Manag2018295907931 – reference: DautenhahnKMethodology & themes of human–robot interaction: a growing research fieldInt J Adv Rob Syst200741103108 – reference: TakiRMaedaYTakahashiYPersonal preference analysis for emotional behavior response of autonomous robot in interactive emotion communicationJ Adv Comput Intell Intell Inform201047852859 – reference: Nadel J, Simon M, Canet P, Soussignan R, Blancard P, Canamero L, Gaussier P (2006) Human responses to an expressive robot. In: Proceedings of the sixth international workshop on epigenetic robotics. Lund University – reference: DevillersLTahonMSehiliMADelabordeAInference of human beings’ emotional states from speech in human–robot interactionsInt J Soc Robot201574451463 – reference: Di Lorenzo G, Pinelli F, Pereira FC, Biderman A, Ratti C, Lee C, Lee C (2009) An affective intelligent driving agent: driver’s trajectory and activities prediction. In: 2009 IEEE 70th vehicular technology conference fall. IEEE, pp 1–4 – reference: LeventhalHBerkowitzLToward a comprehensive theory of emotionAdvances in experimental social psychology1980New YorkAcademic Press139207 – reference: Beck A, Hiolle A, Mazel A, Cañamero L (2010) Interpretation of emotional body language displayed by robots. In: Proceedings of the 3rd international workshop on affective interaction in natural environments, pp 37–42 – reference: GalindoCFernández-MadrigalJAGonzálezJMultihierarchical interactive task planning: application to mobile roboticsIEEE Trans Syst Man Cybern Part B (Cybern)2008383785798 – reference: Hashimoto T, Hitramatsu S, Tsuji T, Kobayashi H (2006) Development of the face robot saya for rich facial expressions. In: 2006 SICE-ICASE international joint conference. IEEE, pp 5423–5428 – reference: Nomura T, Kanda T, Suzuki T, Kato K (2004) Psychology in human–robot communication: an attempt through investigation of negative attitudes and anxiety toward robots. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE Catalog No. 04TH8759). IEEE, pp 35–40 – reference: LeventhalHPlinerPBlanksteinKSpigelIMA perceptual-motor processing model of emotionPerception of emotion in self and others1979New YorkSpringer146 – reference: RussellJAA circumplex model of affectJ Pers Soc Psychol19803961161 – reference: Ribeiro T, Paiva, A (2012) The illusion of robotic life: principles and practices of animation for robots. In: Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, pp 383–390 – reference: StockRMHow should customers be integrated for effective interorganizational NPD teams? An input-process-output perspectiveJ Prod Innov Manag20143135355513297227 – reference: GoddardKRoudsariAWyattJCAutomation bias: a systematic review of frequency, effect mediators, and mitigatorsJ Am Med Inform Assoc2012191121127 – reference: LiYJiangYTianDHuLLuHYuanZAi-enabled emotion communicationIEEE Netw20193361521 – reference: RuxtonGDThe unequal variance t-test is an underused alternative to student’s t-test and the Mann–Whitney u testBehav Ecol2006174688690 – reference: Bartneck, C (2003) Interacting with an embodied emotional character. 
In: Proceedings of the 2003 International Conference on Designing Pleasurable Products and Interfaces, pp. 55–60 – reference: Wang W, Athanasopoulos G, Patsis G, Enescu V, Sahli H (2014) Real-time emotion recognition from natural bodily expressions in child–robot interaction. In: European conference on computer vision. Springer, pp 424–435 – reference: DavisFDPerceived usefulness, perceived ease of use, and user acceptance of information technologyMIS Q1989133319340 – reference: Haring KS, Silvera-Tawil D, Matsumoto Y, Velonaki M, Watanabe K (2014) Perception of an android robot in Japan and Australia: a cross-cultural comparison. In: International conference on social robotics. Springer, pp 166–175 – reference: Le TL, Dong VT(2011) Toward a vietnamese facial expression recognition system for human–robot interaction. In: The 2011 international conference on advanced technologies for communications (ATC 2011). IEEE, pp 252–255 – reference: DandılEÖzdemirRReal-time facial emotion classification using deep learningData Sci Appl2019211317 – reference: MarmpenaMLimADahlTSHow does the robot feel? perception of valence and arousal in emotional body languagePaladyn J Behav Robot201891168182 – reference: Tielman M, Neerincx M, Meyer JJ, Looije, R (2014) Adaptive emotional expression in robot–child interaction. In: 2014 9th ACM/ieee international conference on human–robot interaction (HRI). IEEE, pp 407–414 – reference: Kozima H, Nakagawa C, Yasuda Y (2005) Interactive robots for communication-care: a case-study in autism therapy. In: ROMAN 2005. IEEE international workshop on robot and human interactive communication, 2005. IEEE, pp 341–346 – reference: Huang L, Gillan D (2014) An exploration of robot builders’ emotional responses to their tournament robots. In: Proceedings of the human factors and ergonomics society annual meeting, vol 58, pp 2013–2017. SAGE Publications Sage CA: Los Angeles, CA – reference: Woods S, Dautenhahn K, Schulz J (2004) The design space of robots: investigating children’s views. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE Catalog No. 04TH8759). IEEE, pp 47–52 – reference: KandaTSatoRSaiwakiNIshiguroHA two-month field trial in an elementary school for long-term human–robot interactionIEEE Trans Rob2007235962971 – reference: Kim HR (2010) Hybrid emotion generation architecture with computational models based on psychological theory for human–robot interaction. Diss. Ph. D. dissertation, Korea Adv. Inst. Sci. Technol., Daejeon, Korea – reference: HatfieldECacioppoJTRapsonRLEmotional contagionCurr Dir Psychol Sci19932396100 – reference: Tschöpe N, Reiser JE, Oehl M (2017) Exploring the uncanny valley effect in social robotics. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction, pp 307–308 – reference: Birnbaum GE, Mizrahi M, Hoffman G, Reis HT, Finkel EJ, Sass O (2016) Machines as a source of consolation: robot responsiveness increases human approach behavior and desire for companionship. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 165–172 – reference: Hoffman G, Birnbaum GE, Vanunu K, Sass O, Reis HT (2014) Robot responsiveness to human disclosure affects social impression and appeal. 
In: Proceedings of the 2014 ACM/IEEE international conference on human–robot interaction, pp 1–8 – reference: BreazealCEmotion and sociable humanoid robotsInt J Hum Comput Stud2003591–2119155 – reference: WittigSKloosURätschMEmotion model implementation for parameterized facial animation in human–robot-interaction2016116439445 – reference: Charrier L, Rieger A, Galdeano A, Cordier A, Lefort M, Hassas S (2019) The rope scale: a measure of how empathic a robot is perceived. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 656–657 – reference: EysenckMWKeaneMTCognitive psychology: a student’s handbook2015PhiladelphiaPsychology Press – reference: GhazaliASHamJBarakovaEMarkopoulosPAssessing the effect of persuasive robots interactive social cues on users’ psychological reactance, liking, trusting beliefs and complianceAdv Robot2019337–8325337 – reference: Charrier L, Galdeano A, Cordier A, Lefort M (2018) Empathy display influence on human-robot interactions: a pilot study (2018) – reference: PodsakoffPMOrganDWSelf-reports in organizational research: problems and prospectsJ Manag1986124531544 – reference: KirbyRForlizziJSimmonsRAffective social robotsRobot Auton Syst2010583322332 – reference: Löffler D, Schmidt N, Tscharn R (2018) Multimodal expression of artificial emotion in social robots using color, motion and sound. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction, pp 334–343 – reference: StockRMHoyerWDAn attitude-behavior model of salespeople’s customer orientationJ Acad Mark Sci2005334536552 – reference: Niemelä M, Arvola A, Aaltonen, I (2017) Monitoring the acceptance of a social service robot in a shopping mall: first results. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction, pp 225–226 – reference: OliverRLBalakrishnanPSBarryBOutcome satisfaction in negotiation: a test of expectancy disconfirmationOrgan Behav Hum Decis Process1994602252275 – reference: Thimmesch-GillZHarderKAKoutstaalWPerceiving emotions in robot body language: acute stress heightens sensitivity to negativity while attenuating sensitivity to arousalComput Hum Behav2017765967 – volume: 33 start-page: 325 issue: 7–8 year: 2019 ident: 778_CR91 publication-title: Adv Robot doi: 10.1080/01691864.2019.1589570 – ident: 778_CR92 doi: 10.1109/HRI.2019.8673266 – volume: 13 start-page: 319 issue: 3 year: 1989 ident: 778_CR58 publication-title: MIS Q doi: 10.2307/249008 – ident: 778_CR61 doi: 10.1109/RO-MAN46459.2019.8956367 – volume: 19 start-page: 121 issue: 1 year: 2012 ident: 778_CR94 publication-title: J Am Med Inform Assoc doi: 10.1136/amiajnl-2011-000089 – start-page: 87 volume-title: Wearable Robots Biomechatronic Exoskeletons year: 2008 ident: 778_CR36 doi: 10.1002/9780470987667.ch4 – volume: 8 start-page: 1 issue: 724 year: 2017 ident: 778_CR253 publication-title: Front Psychol – volume: 43 start-page: 568 issue: 3 year: 2000 ident: 778_CR196 publication-title: JSME Int J Ser C doi: 10.1299/jsmec.43.568 – volume: 10 start-page: 399 issue: 12 year: 2013 ident: 778_CR287 publication-title: Int J Adv Rob Syst doi: 10.5772/57313 – ident: 778_CR224 doi: 10.1109/IROS.2012.6385901 – ident: 778_CR27 doi: 10.1109/HRI.2016.7451748 – ident: 778_CR155 doi: 10.1109/ATC.2011.6027478 – ident: 778_CR170 doi: 10.1145/3171221.3171261 – ident: 778_CR23 doi: 10.1109/CVPRW.2019.00035 – volume: 11 start-page: 439 issue: 6 year: 2016 ident: 778_CR278 publication-title: Emotion model implementation for 
parameterized facial animation in human–robot-interaction – ident: 778_CR35 doi: 10.1145/3306618.3314319 – ident: 778_CR52 – volume: 102 start-page: 701 year: 2020 ident: 778_CR288 publication-title: Futur Gener Comput Syst doi: 10.1016/j.future.2019.09.029 – ident: 778_CR41 doi: 10.1007/978-3-319-62530-0_10 – ident: 778_CR220 doi: 10.1109/ROMAN.2009.5326306 – volume: 140 start-page: 93 year: 2015 ident: 778_CR298 publication-title: Comput Vis Image Underst doi: 10.1016/j.cviu.2015.07.007 – ident: 778_CR95 doi: 10.1109/IROS.2012.6385941 – volume: 8 start-page: 1 issue: 1 year: 2017 ident: 778_CR263 publication-title: Paladyn J Behav Roboti doi: 10.1515/pjbr-2017-0001 – volume: 12 start-page: 531 issue: 4 year: 1986 ident: 778_CR207 publication-title: J Manag – ident: 778_CR93 doi: 10.1109/ROMAN.2006.314448 – volume: 55 start-page: 1590 issue: 3 year: 2009 ident: 778_CR203 publication-title: IEEE Trans Consum Electron doi: 10.1109/TCE.2009.5278031 – ident: 778_CR292 doi: 10.1007/978-3-030-35888-4_59 – ident: 778_CR105 doi: 10.1109/ROMAN.2011.6005263 – ident: 778_CR268 doi: 10.1007/978-3-030-19591-5_20 – volume-title: The contingency theory of organizations year: 2001 ident: 778_CR65 doi: 10.4135/9781452229249 – ident: 778_CR181 doi: 10.1109/IROS.2001.976329 – ident: 778_CR57 – volume: 54 start-page: 177 issue: 2 year: 2001 ident: 778_CR77 publication-title: J Bus Res doi: 10.1016/S0148-2963(99)00087-9 – volume: 104 start-page: 333 year: 2019 ident: 778_CR227 publication-title: J Bus Res doi: 10.1016/j.jbusres.2019.07.039 – volume: 19 start-page: 393 issue: 4 year: 2015 ident: 778_CR274 publication-title: Rev Gen Psychol doi: 10.1037/gpr0000056 – volume: 53 start-page: 1 issue: 1 year: 2009 ident: 778_CR164 publication-title: Eur Econ Rev doi: 10.1016/j.euroecorev.2008.12.001 – start-page: 139 volume-title: Advances in experimental social psychology year: 1980 ident: 778_CR161 – volume: 37 start-page: 21 issue: 4 year: 2013 ident: 778_CR272 publication-title: MIS Q doi: 10.25300/MISQ/2013/37.1.02 – volume: 1 start-page: 215 issue: 3 year: 2011 ident: 778_CR20 publication-title: J Artif Intell Soft Comput Res – volume: 4 start-page: 103 issue: 1 year: 2007 ident: 778_CR56 publication-title: Int J Adv Rob Syst – volume: 14 start-page: e0224758 issue: 11 year: 2019 ident: 778_CR179 publication-title: PloS One doi: 10.1371/journal.pone.0224758 – volume: 49 start-page: 193 issue: 2 year: 2000 ident: 778_CR267 publication-title: J Bus Res doi: 10.1016/S0148-2963(99)00010-7 – volume: 42 start-page: 203 issue: 3–4 year: 2003 ident: 778_CR226 publication-title: Robot Auton Syst doi: 10.1016/S0921-8890(02)00376-7 – volume: 10 start-page: 160 issue: 3 year: 2013 ident: 778_CR282 publication-title: Int J Adv Rob Syst doi: 10.5772/55607 – volume: 4 start-page: 13 issue: 1 year: 2008 ident: 778_CR189 publication-title: Tutor Quant Methods Psychol doi: 10.20982/tqmp.04.1.p013 – ident: 778_CR225 doi: 10.1145/3349537.3352764 – ident: 778_CR269 doi: 10.1007/978-3-030-35888-4_41 – ident: 778_CR3 doi: 10.1109/ICIEV.2019.8858529 – ident: 778_CR121 doi: 10.1145/3349537.3351890 – ident: 778_CR149 doi: 10.1007/978-3-030-22643-5_10 – ident: 778_CR146 doi: 10.1109/HRI.2016.7451773 – ident: 778_CR182 – ident: 778_CR264 doi: 10.1145/3029798.3038319 – ident: 778_CR219 doi: 10.1109/HRI.2010.5453269 – ident: 778_CR40 – volume: 16 start-page: 608 issue: 5 year: 2010 ident: 778_CR32 publication-title: Telemed e-Health doi: 10.1089/tmj.2009.0171 – ident: 778_CR50 doi: 10.1109/HRI.2019.8673222 – volume: 19 start-page: 61 
issue: 1–2 year: 2004 ident: 778_CR127 publication-title: Hum Comput Interact doi: 10.1207/s15327051hci1901&2_4 – ident: 778_CR2 doi: 10.1109/CIT.2008.Workshops.85 – volume: 13 start-page: 46 year: 2019 ident: 778_CR240 publication-title: Front Neurorobot doi: 10.3389/fnbot.2019.00046 – ident: 778_CR122 doi: 10.1109/SII.2019.8700422 – volume-title: An approach to environmental psychology year: 1974 ident: 778_CR178 – ident: 778_CR115 doi: 10.1109/IROS.2006.282327 – ident: 778_CR233 – ident: 778_CR250 – volume: 374 start-page: 1 issue: 1771 year: 2019 ident: 778_CR101 publication-title: Philos Trans R Soc B doi: 10.1098/rstb.2018.0026 – volume: 35 start-page: 137 issue: 1 year: 2005 ident: 778_CR276 publication-title: Res Sci Educ doi: 10.1007/s11165-004-3437-y – ident: 778_CR26 doi: 10.1007/3-540-45631-7_17 – volume: 37 start-page: 14 issue: 1 year: 2014 ident: 778_CR228 publication-title: J Chin Inst Eng doi: 10.1080/02533839.2012.751330 – volume: 3 start-page: 73 issue: 2 year: 2010 ident: 778_CR299 publication-title: Intel Serv Robot doi: 10.1007/s11370-010-0060-9 – ident: 778_CR255 doi: 10.1109/HRI.2019.8673172 – ident: 778_CR254 doi: 10.1109/ROMAN.2012.6343772 – ident: 778_CR188 doi: 10.1145/1514095.1514110 – volume: 6 start-page: 1 issue: 54 year: 2019 ident: 778_CR145 publication-title: Front Robot AI – ident: 778_CR19 doi: 10.1109/WACI.2011.5953147 – ident: 778_CR154 doi: 10.1109/IIH-MSP.2014.204 – ident: 778_CR148 doi: 10.1109/ROMAN.2005.1513802 – ident: 778_CR126 doi: 10.1145/2909824.3020224 – ident: 778_CR171 doi: 10.1109/CONIELECOMP.2019.8673111 – ident: 778_CR266 doi: 10.1016/j.procs.2015.08.102 – ident: 778_CR167 doi: 10.1007/11573548_122 – volume: 5 start-page: 291 issue: 2 year: 2013 ident: 778_CR158 publication-title: Int J Soc Robot doi: 10.1007/s12369-013-0178-y – volume: 5 start-page: 127 issue: 1 year: 2019 ident: 778_CR301 publication-title: IEEE Robot Automa Lett doi: 10.1109/LRA.2019.2947010 – ident: 778_CR49 doi: 10.1145/1957656.1957818 – volume-title: The cognitive structure of emotions year: 1988 ident: 778_CR198 doi: 10.1017/CBO9780511571299 – ident: 778_CR208 doi: 10.1007/s12369-021-00763-2 – ident: 778_CR165 doi: 10.1145/1957656.1957789 – ident: 778_CR209 doi: 10.1115/DSCC2015-9841 – ident: 778_CR296 doi: 10.1007/978-981-13-9406-5_15 – ident: 778_CR281 doi: 10.1109/ICEIEC.2019.8784476 – volume: 7 start-page: 9848 year: 2019 ident: 778_CR60 publication-title: IEEE Access doi: 10.1109/ACCESS.2019.2891668 – ident: 778_CR195 doi: 10.1145/2559636.2563702 – ident: 778_CR123 doi: 10.1109/TEXCRA.2004.1424983 – ident: 778_CR211 doi: 10.1007/s12369-022-00867-0 – volume: 5 start-page: 441 issue: 4 year: 2013 ident: 778_CR76 publication-title: Int J Soc Robot doi: 10.1007/s12369-013-0200-4 – ident: 778_CR218 doi: 10.1145/1957656.1957669 – volume: 328 start-page: 0405184 issue: Suppl S5 year: 2004 ident: 778_CR71 publication-title: BMJ doi: 10.1136/sbmj.0405184 – volume: 7 start-page: 451 issue: 4 year: 2015 ident: 778_CR62 publication-title: Int J Soc Robot doi: 10.1007/s12369-015-0297-8 – volume: 42 start-page: 253 issue: 3 year: 1998 ident: 778_CR289 publication-title: J Bus Res doi: 10.1016/S0148-2963(97)00122-7 – volume: 30 start-page: 343 issue: 4 year: 2018 ident: 778_CR38 publication-title: Connect Sci doi: 10.1080/09540091.2018.1454889 – ident: 778_CR42 doi: 10.1109/ISCID.2008.170 – ident: 778_CR75 doi: 10.1109/ROMAN.2012.6343883 – volume-title: Theories of human communication year: 2010 ident: 778_CR168 – volume: 39 start-page: 1161 issue: 6 year: 1980 ident: 
778_CR216 publication-title: J Pers Soc Psychol doi: 10.1037/h0077714 – ident: 778_CR180 doi: 10.1145/375735.376103 – volume: 14 start-page: 207 issue: 3 year: 2003 ident: 778_CR261 publication-title: Br J Manag doi: 10.1111/1467-8551.00375 – ident: 778_CR280 doi: 10.1109/TSMC.2019.2897330 – volume: 164 start-page: 86 issue: 3875 year: 1969 ident: 778_CR74 publication-title: Science doi: 10.1126/science.164.3875.86 – volume: 26 start-page: 897 issue: 5 year: 2010 ident: 778_CR130 publication-title: IEEE Trans Rob doi: 10.1109/TRO.2010.2062550 – volume-title: Principles of marketing management year: 1986 ident: 778_CR12 – ident: 778_CR44 doi: 10.1109/HRI.2019.8673082 – ident: 778_CR273 doi: 10.23919/SCSE.2019.8842658 – ident: 778_CR223 doi: 10.1109/SMC.2019.8914198 – volume: 10 start-page: 252 issue: 3 year: 2006 ident: 778_CR107 publication-title: Pers Soc Psychol Rev doi: 10.1207/s15327957pspr1003_4 – ident: 778_CR302 doi: 10.1109/DASC/PiCom/CBDCom/CyberSciTech.2019.00076 – ident: 778_CR85 doi: 10.1145/1228716.1228734 – volume: 29 start-page: 1216 issue: 6 year: 2015 ident: 778_CR284 publication-title: Auton Agent Multi Agent Syst doi: 10.1007/s10458-015-9307-3 – ident: 778_CR82 doi: 10.1109/HRI.2019.8673078 – volume: 37 start-page: 162 year: 2012 ident: 778_CR90 publication-title: Rev Res Soc Interv – ident: 778_CR241 doi: 10.1109/PERCOMW.2019.8730714 – volume: 6 start-page: 281 issue: 2 year: 2014 ident: 778_CR231 publication-title: Int J Soc Robot doi: 10.1007/s12369-013-0224-9 – ident: 778_CR210 doi: 10.1109/ICHR.2004.1442120 – ident: 778_CR112 doi: 10.1525/9780520951853 – ident: 778_CR46 doi: 10.1109/FG.2018.00072 – ident: 778_CR117 doi: 10.1109/HRI.2019.8673012 – volume: 3 start-page: 291 issue: 3 year: 2011 ident: 778_CR33 publication-title: Int J Soc Robot doi: 10.1007/s12369-011-0096-9 – volume: 12 start-page: 83 issue: 1 year: 2002 ident: 778_CR30 publication-title: Auton Robot doi: 10.1023/A:1013215010749 – ident: 778_CR176 doi: 10.1109/IROS.2005.1545125 – ident: 778_CR156 doi: 10.1007/s11370-019-00301-x – volume: 29 start-page: 5 issue: 1 year: 2015 ident: 778_CR25 publication-title: German J Hum Resour Manag doi: 10.1177/239700221502900101 – volume: 11 start-page: 693 issue: 5 year: 2019 ident: 778_CR294 publication-title: Int J Soc Robot doi: 10.1007/s12369-019-00603-1 – volume: 10 start-page: 163 issue: 1 year: 2018 ident: 778_CR70 publication-title: Int J Soc Robot doi: 10.1007/s12369-017-0439-2 – ident: 778_CR151 doi: 10.21437/Eurospeech.2003-80 – ident: 778_CR13 doi: 10.1007/978-3-030-42307-0_2 – ident: 778_CR72 doi: 10.1002/0470013494.ch3 – volume: 23 start-page: 962 issue: 5 year: 2007 ident: 778_CR128 publication-title: IEEE Trans Rob doi: 10.1109/TRO.2007.904904 – volume: 19 start-page: 98 issue: 2 year: 2012 ident: 778_CR184 publication-title: IEEE Robot Autom Mag doi: 10.1109/MRA.2012.2192811 – ident: 778_CR47 doi: 10.1109/WCICA.2018.8630711 – ident: 778_CR24 – ident: 778_CR98 doi: 10.1007/978-3-319-25554-5_19 – ident: 778_CR84 doi: 10.1007/3-540-36460-9_17 – ident: 778_CR141 – ident: 778_CR300 doi: 10.1007/978-3-319-92537-0_88 – volume: 5 start-page: 563 issue: 4 year: 2013 ident: 778_CR192 publication-title: Int J Soc Robot doi: 10.1007/s12369-013-0201-3 – ident: 778_CR97 – ident: 778_CR144 doi: 10.1109/ICCCyb.2013.6617620 – volume: 11 start-page: 141 issue: 1 year: 2019 ident: 778_CR286 publication-title: Int J Soc Robot doi: 10.1007/s12369-018-0483-6 – ident: 778_CR186 doi: 10.1007/978-3-319-07230-2_64 – ident: 778_CR43 – ident: 778_CR114 doi: 
10.1145/2696454.2696495 – volume: 59 start-page: 119 issue: 1–2 year: 2003 ident: 778_CR29 publication-title: Int J Hum Comput Stud doi: 10.1016/S1071-5819(03)00018-1 – volume: 8 start-page: 357 issue: 4 year: 1993 ident: 778_CR9 publication-title: Adv Robot doi: 10.1163/156855394X00158 – ident: 778_CR124 doi: 10.1007/3-211-38927-X_33 – ident: 778_CR45 doi: 10.1007/978-1-4614-8280-2_18 – volume: 9 start-page: 168 issue: 1 year: 2018 ident: 778_CR174 publication-title: Paladyn J Behav Robot doi: 10.1515/pjbr-2018-0012 – ident: 778_CR290 doi: 10.1109/ICRA.2019.8793720 – volume: 9 start-page: 231 issue: 2 year: 2017 ident: 778_CR89 publication-title: Int J Soc Robot doi: 10.1007/s12369-016-0389-0 – ident: 778_CR201 – volume: 1 start-page: 149 issue: 2 year: 1971 ident: 778_CR249 publication-title: Eur J Soc Psychol doi: 10.1002/ejsp.2420010202 – volume: 40 start-page: 193 issue: 2 year: 2016 ident: 778_CR6 publication-title: Auton Robot doi: 10.1007/s10514-015-9444-1 – ident: 778_CR132 – ident: 778_CR69 doi: 10.4324/9781315660998 – ident: 778_CR270 – volume: 18 start-page: 71 issue: 1 year: 2004 ident: 778_CR59 publication-title: Cognit Emot doi: 10.1080/02699930244000444 – volume: 10 start-page: 9 issue: 1 year: 1984 ident: 778_CR260 publication-title: J Manag – ident: 778_CR275 doi: 10.1007/978-3-319-16199-0_30 – ident: 778_CR63 doi: 10.1109/VETECF.2009.5378965 – ident: 778_CR64 doi: 10.1109/ROMAN.2005.1513860 – volume: 29 start-page: 907 issue: 5 year: 2018 ident: 778_CR277 publication-title: J Serv Manag doi: 10.1108/JOSM-04-2018-0119 – ident: 778_CR1 doi: 10.1109/IRIS.2015.7451614 – ident: 778_CR157 doi: 10.1145/3173386.3177063 – volume: 10 start-page: 346 issue: 1 year: 2019 ident: 778_CR28 publication-title: Paladyn J Behav Robot doi: 10.1515/pjbr-2019-0028 – volume: 3 start-page: 48 issue: 1 year: 2003 ident: 778_CR37 publication-title: Emotion doi: 10.1037/1528-3542.3.1.48 – volume: 5 start-page: 35 issue: 3 year: 2005 ident: 778_CR131 publication-title: Kansei Eng Int doi: 10.5057/kei.5.3_35 – volume: 5 start-page: 51 issue: 1 year: 2015 ident: 778_CR125 publication-title: J Artif Intell Soft Comput Res doi: 10.1515/jaiscr-2015-0018 – ident: 778_CR215 doi: 10.1145/2157689.2157814 – volume: 163 start-page: 163 year: 1984 ident: 778_CR259 publication-title: Approach Emot – volume-title: Social identity and intergroup relations year: 1982 ident: 778_CR248 – volume: 19 start-page: 2844 issue: 13 year: 2019 ident: 778_CR96 publication-title: Sensors doi: 10.3390/s19132844 – volume: 1 start-page: 3 issue: 1 year: 1987 ident: 778_CR162 publication-title: Cogn Emot doi: 10.1080/02699938708408361 – volume: 85 start-page: 308 year: 2018 ident: 778_CR230 publication-title: Comput Hum Behav doi: 10.1016/j.chb.2018.03.043 – start-page: 67 volume-title: Human-robot interactions in future military operations year: 2016 ident: 778_CR257 – ident: 778_CR18 doi: 10.1145/1877826.1877837 – ident: 778_CR79 doi: 10.1145/1349822.1349856 – volume: 76 start-page: 59 year: 2017 ident: 778_CR256 publication-title: Comput Hum Behav doi: 10.1016/j.chb.2017.06.036 – ident: 778_CR262 doi: 10.1109/HUMANOIDS.2012.6651536 – ident: 778_CR187 doi: 10.1007/978-3-642-03983-6_10 – ident: 778_CR134 doi: 10.1109/ICCES45898.2019.9002175 – ident: 778_CR242 – volume: 4 start-page: 668 issue: 4 year: 2017 ident: 778_CR169 publication-title: IEEC/CAA J Automat Sinica doi: 10.1109/JAS.2017.7510622 – ident: 778_CR295 doi: 10.1109/ROMAN.2009.5326184 – ident: 778_CR55 doi: 10.1109/ETCM.2017.8247472 – ident: 778_CR137 – ident: 
778_CR118 doi: 10.1007/978-3-030-30036-4_26 – ident: 778_CR239 doi: 10.24251/HICSS.2018.133 – volume: 16 start-page: 172988141881797 issue: 1 year: 2019 ident: 778_CR214 publication-title: Int J Adv Rob Syst doi: 10.1177/1729881418817972 – volume-title: Human groups and social categories: studies in social psychology year: 1981 ident: 778_CR247 – volume: 60 start-page: 252 issue: 2 year: 1994 ident: 778_CR197 publication-title: Organ Behav Hum Decis Process doi: 10.1006/obhd.1994.1083 – start-page: 1 volume-title: Perception of emotion in self and others year: 1979 ident: 778_CR160 – volume: 87 start-page: 663 issue: 4 year: 1996 ident: 778_CR204 publication-title: Br J Psychol doi: 10.1111/j.2044-8295.1996.tb02615.x – ident: 778_CR159 doi: 10.1145/2696454.2696481 – ident: 778_CR244 doi: 10.1145/1514095.1514106 – ident: 778_CR236 – ident: 778_CR14 doi: 10.1145/782896.782911 – volume: 2 start-page: 1 issue: 3 year: 2018 ident: 778_CR291 publication-title: Int J Robot Eng (IJRE) – volume: 88 start-page: 879 issue: 5 year: 2003 ident: 778_CR206 publication-title: J Appl Psychol doi: 10.1037/0021-9010.88.5.879 – ident: 778_CR16 doi: 10.1109/ROMAN.2010.5598649 – ident: 778_CR229 doi: 10.1145/2909824.3020239 – ident: 778_CR116 doi: 10.24251/HICSS.2018.559 – ident: 778_CR129 doi: 10.1145/1514095.1514127 – ident: 778_CR150 doi: 10.1109/ARSO.2010.5679999 – volume: 2 start-page: 33 issue: 2 year: 2008 ident: 778_CR109 publication-title: J Phys Agents – ident: 778_CR234 doi: 10.24251/HICSS.2019.234 – volume: 29 start-page: 143 issue: 2 year: 1982 ident: 778_CR10 publication-title: Organ Behav Hum Perform doi: 10.1016/0030-5073(82)90254-9 – ident: 778_CR99 doi: 10.1109/SMC.2019.8914039 – ident: 778_CR190 – ident: 778_CR199 doi: 10.31234/osf.io/ubq34 – volume: 88 start-page: 103356 year: 2020 ident: 778_CR271 publication-title: Eng Appl Artif Intell doi: 10.1016/j.engappai.2019.103356 – start-page: 25 volume-title: Review of personality and social psychology: emotion year: 1992 ident: 778_CR152 – volume: 38 start-page: 729 issue: 3 year: 2014 ident: 778_CR34 publication-title: MIS Q doi: 10.25300/MISQ/2014/38.3.05 – ident: 778_CR103 – ident: 778_CR136 doi: 10.1109/ROMAN.2009.5326282 – volume: 11 start-page: 555 issue: 4 year: 2019 ident: 778_CR265 publication-title: Int J Soc Robot doi: 10.1007/s12369-019-00524-z – ident: 778_CR73 doi: 10.1037/t27734-000 – volume-title: Circumplex models of personality and emotions year: 1997 ident: 778_CR205 doi: 10.1037/10261-000 – volume: 33 start-page: 15 issue: 6 year: 2019 ident: 778_CR166 publication-title: IEEE Netw doi: 10.1109/MNET.001.1900070 – volume: 58 start-page: 34 issue: 1 year: 1982 ident: 778_CR66 publication-title: J Retail – volume: 1 start-page: 3 issue: 1 year: 2009 ident: 778_CR147 publication-title: Int J Soc Robot doi: 10.1007/s12369-008-0009-8 – volume: 7 start-page: 117 issue: 2 year: 1954 ident: 778_CR81 publication-title: Hum Relat doi: 10.1177/001872675400700202 – ident: 778_CR39 doi: 10.1145/267658.267688 – volume: 6 start-page: 367 issue: 3 year: 2014 ident: 778_CR22 publication-title: Int J Soc Robot doi: 10.1007/s12369-014-0237-z – ident: 778_CR139 – volume: 4 start-page: 20 issue: 2 year: 2015 ident: 778_CR212 publication-title: Int J Adv Res Artif Intell (IJARAI) – volume: 17 start-page: 297 issue: 3 year: 2013 ident: 778_CR67 publication-title: Rev Gen Psychol doi: 10.1037/a0032947 – volume: 31 start-page: 535 issue: 3 year: 2014 ident: 778_CR235 publication-title: J Prod Innov Manag doi: 10.1111/jpim.12112 – volume: 4 start-page: 
852 issue: 7 year: 2010 ident: 778_CR251 publication-title: J Adv Comput Intell Intell Inform doi: 10.20965/jaciii.2010.p0852 – volume: 1 start-page: 311 issue: 3 year: 1997 ident: 778_CR15 publication-title: Rev Gen Psychol doi: 10.1037/1089-2680.1.3.311 – start-page: 271 volume-title: Who needs emotions? The brain meets the robot year: 2005 ident: 778_CR31 doi: 10.1093/acprof:oso/9780195166194.003.0010 – ident: 778_CR104 doi: 10.1007/978-3-319-11973-1_17 – volume: 59 start-page: 411 year: 2016 ident: 778_CR86 publication-title: Comput Hum Behav doi: 10.1016/j.chb.2016.02.043 – ident: 778_CR113 doi: 10.1145/2559636.2559660 – ident: 778_CR200 doi: 10.1007/978-3-030-23522-2_38 – volume: 5 start-page: 325 issue: 3 year: 2013 ident: 778_CR17 publication-title: Int J Soc Robot doi: 10.1007/s12369-013-0193-z – volume: 2 start-page: 13 issue: 1 year: 2019 ident: 778_CR54 publication-title: Data Sci Appl – volume: 1 start-page: 173 issue: S1 year: 1969 ident: 778_CR246 publication-title: J Biosoc Sci doi: 10.1017/S0021932000023336 – volume: 21 start-page: 153 issue: 2 year: 2007 ident: 778_CR163 publication-title: J Econ Perspect doi: 10.1257/jep.21.2.153 – volume: 11 start-page: 353 issue: 2 year: 2008 ident: 778_CR133 publication-title: Organ Res Methods doi: 10.1177/1094428107308978 – ident: 778_CR5 doi: 10.1109/HRI.2013.6483606 – volume: 10 start-page: 473 issue: 4 year: 2018 ident: 778_CR7 publication-title: Int J Soc Robot doi: 10.1007/s12369-017-0446-3 – ident: 778_CR175 doi: 10.1109/ACII.2019.8925459 – ident: 778_CR258 doi: 10.1145/2559636.2559663 – ident: 778_CR11 doi: 10.1007/978-3-030-36150-1_49 – ident: 778_CR283 – volume-title: Cognitive psychology: a student’s handbook year: 2015 ident: 778_CR80 doi: 10.4324/9781315778006 – ident: 778_CR88 – volume: 60 start-page: 263 issue: 2 year: 2010 ident: 778_CR138 publication-title: J Intell Robot Syst doi: 10.1007/s10846-010-9418-7 – ident: 778_CR120 doi: 10.1109/ROMAN.2007.4415195 – volume: 9 start-page: 35 issue: 2 year: 2012 ident: 778_CR202 publication-title: Int J Adv Rob Syst doi: 10.5772/50228 – volume: 7 start-page: 234 issue: 1 year: 2019 ident: 778_CR78 publication-title: IEEE Trans Comput Soc Syst doi: 10.1109/TCSS.2019.2922593 – volume: 13 start-page: 15549 issue: 11 year: 2013 ident: 778_CR4 publication-title: Sensors doi: 10.3390/s131115549 – ident: 778_CR191 doi: 10.1145/3029798.3038333 – ident: 778_CR143 – volume: 40 start-page: 5160 issue: 13 year: 2013 ident: 778_CR297 publication-title: Expert Syst Appl doi: 10.1016/j.eswa.2013.03.016 – ident: 778_CR110 – volume: 33 start-page: 536 issue: 4 year: 2005 ident: 778_CR237 publication-title: J Acad Mark Sci doi: 10.1177/0092070305276368 – ident: 778_CR213 doi: 10.1145/2157689.2157764 – volume: 11 start-page: 117 issue: 5 year: 2018 ident: 778_CR51 publication-title: Int J Control Autom doi: 10.14257/ijca.2018.11.5.11 – volume: 82 start-page: 101 issue: 1 year: 2016 ident: 778_CR177 publication-title: J Intell Robot Syst doi: 10.1007/s10846-015-0259-2 – ident: 778_CR221 doi: 10.1145/1121241.1121281 – ident: 778_CR119 doi: 10.1177/1541931214581420 – volume: 55 start-page: 171 issue: 3 year: 1987 ident: 778_CR303 publication-title: J Exp Educ doi: 10.1080/00220973.1987.10806451 – volume: 509 start-page: 150 year: 2020 ident: 778_CR48 publication-title: Inf Sci doi: 10.1016/j.ins.2019.09.005 – volume: 9 start-page: 277 issue: 2 year: 2017 ident: 778_CR53 publication-title: Int J Soc Robot doi: 10.1007/s12369-016-0387-2 – volume: 7 start-page: 33 issue: 4 year: 1970 ident: 778_CR185 
publication-title: Energy – ident: 778_CR194 – volume: 104 start-page: 17954 issue: 46 year: 2007 ident: 778_CR252 publication-title: Proc Natl Acad Sci doi: 10.1073/pnas.0707769104 – volume: 38 start-page: 785 issue: 3 year: 2008 ident: 778_CR87 publication-title: IEEE Trans Syst Man Cybern Part B (Cybern) doi: 10.1109/TSMCB.2008.920227 – volume: 2 start-page: 96 issue: 3 year: 1993 ident: 778_CR108 publication-title: Curr Dir Psychol Sci doi: 10.1111/1467-8721.ep10770953 – ident: 778_CR183 doi: 10.1145/2909824.3020216 – volume: 7 start-page: 297 issue: 3 year: 2006 ident: 778_CR172 publication-title: Interact Stud doi: 10.1075/is.7.3.03mac – ident: 778_CR238 doi: 10.1109/PERCOMW.2017.7917585 – volume: 86 start-page: 1 year: 2020 ident: 778_CR153 publication-title: Int J Soc Robot – volume: 58 start-page: 322 issue: 3 year: 2010 ident: 778_CR142 publication-title: Robot Auton Syst doi: 10.1016/j.robot.2009.09.015 – ident: 778_CR293 – ident: 778_CR8 doi: 10.1109/SII.2019.8700376 – ident: 778_CR21 doi: 10.1007/978-3-030-19591-5_18 – ident: 778_CR173 doi: 10.1109/SCIS-ISIS.2018.00044 – volume: 14 start-page: 317 issue: 3 year: 2009 ident: 778_CR135 publication-title: IEEE/ASME Trans Mechatron doi: 10.1109/TMECH.2008.2008644 – volume: 7 start-page: 279 issue: 2 year: 2015 ident: 778_CR285 publication-title: Int J Soc Robot doi: 10.1007/s12369-014-0248-9 – ident: 778_CR243 – ident: 778_CR193 doi: 10.1109/ROMAN.2004.1374726 – volume: 10 start-page: 147 issue: 1 year: 2018 ident: 778_CR222 publication-title: Int J Soc Robot doi: 10.1007/s12369-017-0441-8 – volume: 26 start-page: 261 issue: 2 year: 2002 ident: 778_CR100 publication-title: Camb J Econ doi: 10.1093/cje/26.2.261 – volume: 7 start-page: 41273 year: 2019 ident: 778_CR140 publication-title: IEEE Access doi: 10.1109/ACCESS.2019.2907327 – volume: 17 start-page: 688 issue: 4 year: 2006 ident: 778_CR217 publication-title: Behav Ecol doi: 10.1093/beheco/ark016 – volume: 6 start-page: 90 issue: 1 year: 2011 ident: 778_CR68 publication-title: Soc Cognit Affect Neurosci doi: 10.1093/scan/nsq019 – ident: 778_CR279 doi: 10.1109/ROMAN.2004.1374728 – ident: 778_CR232 doi: 10.1109/HICSS.2016.273 – volume: 50 start-page: 992 issue: 5 year: 1986 ident: 778_CR83 publication-title: J Pers Soc Psychol doi: 10.1037/0022-3514.50.5.992 – ident: 778_CR245 doi: 10.21437/Interspeech.2011-781 – ident: 778_CR111 doi: 10.1109/ICHR.2006.321363 – ident: 778_CR106 doi: 10.1109/SICE.2006.315537 – ident: 778_CR102 doi: 10.1109/CogInfoCom.2012.6421937 |
SubjectTerms | Control; Emotions; Engineering; Mechatronics; Moderators; Psychology; Recognition; Robotics; Robots; State-of-the-art reviews |
Title | Survey of Emotions in Human–Robot Interactions: Perspectives from Robotic Psychology on 20 Years of Research |
URI | https://link.springer.com/article/10.1007/s12369-021-00778-6 https://www.proquest.com/docview/2644596672 |
Volume | 14 |