Artificial Empathy in Social Robots: An analysis of Emotions in Speech

Bibliographic Details
Published in: 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 632-637
Main Authors: James, Jesin; Watson, Catherine Inez; MacDonald, Bruce
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2018

Summary: Artificial speech developed using speech synthesizers has been used as the voice for robots in Human Robot Interaction (HRI). As humans anthropomorphize robots, an empathetically interacting robot is expected to increase the level of acceptance of social robots. Here, a human perception experiment evaluates whether human subjects perceive empathy in robot speech. For this experiment, empathy is expressed only by adding appropriate emotions to the words in speech. Humans' preferences for a robot interacting with empathetic speech versus a standard robotic voice are also assessed. The results show that humans are able to perceive empathy and emotions in robot speech, and prefer it over the standard robotic voice. It is important for the emotions in empathetic speech to be consistent with the language content of what is being said, and with the human user's emotional state. Analyzing emotions in empathetic speech using the valence-arousal model has revealed the importance of secondary emotions in developing empathetically speaking social robots.
ISSN: 1944-9437
DOI: 10.1109/ROMAN.2018.8525652
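
The summary refers to the valence-arousal model of emotion. As a minimal illustration (not taken from the paper), the sketch below places a few hypothetical emotion labels on a valence-arousal plane and maps an arbitrary point to its nearest label; the emotion names and coordinates are illustrative assumptions, not the authors' data or method.

```python
# Minimal sketch of a valence-arousal emotion space (illustrative only).
from dataclasses import dataclass
from math import hypot

@dataclass
class Emotion:
    name: str
    valence: float  # negative (-1.0) to positive (+1.0)
    arousal: float  # calm (-1.0) to excited (+1.0)

# Hypothetical placements of primary and secondary emotions (assumed values).
EMOTIONS = [
    Emotion("happy", 0.8, 0.5),       # primary
    Emotion("sad", -0.7, -0.4),       # primary
    Emotion("relieved", 0.6, -0.3),   # secondary
    Emotion("worried", -0.5, 0.3),    # secondary
]

def nearest(valence: float, arousal: float) -> Emotion:
    """Return the emotion label closest to a point in valence-arousal space."""
    return min(EMOTIONS, key=lambda e: hypot(e.valence - valence, e.arousal - arousal))

if __name__ == "__main__":
    # Example: a mildly positive, low-arousal point maps to "relieved".
    print(nearest(0.5, -0.2).name)
```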