Studying Effects of Incorporating Automated Affect Perception with Spoken Dialog in Social Robots
Published in: 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 783-789
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2018
ISSN: 1944-9437
DOI: 10.1109/ROMAN.2018.8525777
Abstract: Social robots are becoming an integrated part of our daily lives, with the goal of understanding humans' social intentions and feelings, a capability often referred to as empathy. Despite significant progress towards the development of empathic social agents, current social robots have yet to reach full emotional and social capabilities. This paper presents our recent effort to incorporate an automated Facial Expression Recognition (FER) system based on deep neural networks into the spoken dialog of a social robot (Ryan), extending and enriching its capabilities beyond spoken dialog by integrating the user's affective state into the robot's responses. To evaluate whether this incorporation can improve Ryan's social capabilities, we conducted a series of Human-Robot Interaction (HRI) experiments in which the subjects watched videos while Ryan engaged them in a conversation driven by the facial expressions the robot perceived. We measured the accuracy of the automated FER system on the robot when interacting with different human subjects, as well as three social/interactive aspects, namely task engagement, empathy, and likability of the robot. The results of our HRI study indicate that the subjects rated the empathy and likability of the affect-aware Ryan significantly higher than those of the non-empathic (control condition) Ryan. Interestingly, we found that the accuracy of the FER system is not a limiting factor: subjects rated the affect-aware agent equipped with a low-accuracy FER system as empathic and likable as when facial expressions were recognized by a human observer.
Authors: Mollahosseini, Ali; Abdollahi, Hojjat; Mahoor, Mohammad H (all: Department of Electrical and Computer Engineering, University of Denver, CO, USA)
Discipline: Engineering
EISBN: 1538679809; 9781538679807
EISSN: 1944-9437
Genre: Original research
Subjects: Face recognition; Mirrors; Observers; Robots; Speech recognition; Task analysis; Videos
URI: https://ieeexplore.ieee.org/document/8525777