Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots
Main Authors | , , , , , , , , , , , , |
---|---|
Format | Journal Article |
Language | English |
Published | 08.12.2020 |
Summary: | We introduce the conceptual formulation, design, fabrication, control,
and commercial translation, with IoT connectivity, of a hybrid-face social
robot, and validate human emotional response to its affective interactions. The
hybrid-face robot integrates a 3D-printed faceplate and a digital display to
simplify the conveyance of complex facial movements while providing the
impression of three-dimensional depth for natural interaction. We map the space
of the robot's potential emotions to specific facial feature parameters and
characterise the recognisability of the humanoid hybrid-face robot's archetypal
facial expressions. We introduce pupil dilation as an additional degree of
freedom for conveying emotive states. Human interaction experiments demonstrate
that the hybrid-robot face effectively conveys emotion to human observers, as
assessed by mapping their neurophysiological electroencephalography (EEG)
responses to the perceived emotional information and through interviews. Results
show that the main hybrid-face robotic expressions can be discriminated with
recognition rates above 80% and evoke human emotive responses similar to those
elicited by actual human faces, as measured by the face-specific N170
event-related potential in EEG. The hybrid-face robot concept has been modified,
implemented, and released in the commercial IoT robotic platform Miko ("My
Companion"), an affective robot with facial and conversational features
currently in use for human-robot interaction with children by Emotix Inc. We
demonstrate that human EEG responses to Miko's emotions are comparable to the
neurophysiological responses elicited by actual human face recognition. Finally,
interviews show above-90% expression recognition rates for our commercial robot.
We conclude that simplified hybrid-face abstraction conveys emotions
effectively and enhances human-robot interaction. |
DOI: | 10.48550/arxiv.2012.04511 |
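The summary describes mapping the robot's emotion space to specific facial feature parameters, with pupil dilation as an extra degree of freedom. A minimal sketch of what such a mapping could look like is below; the parameter names (`eyebrow_angle_deg`, `mouth_curvature`, `pupil_dilation`), the archetypal emotions chosen, and all numeric values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical emotion-to-facial-parameter mapping in the spirit of the
# abstract: each archetypal expression is a small tuple of display
# parameters, including pupil dilation as an additional degree of freedom.
# All names and values are illustrative assumptions.

EXPRESSIONS = {
    # emotion: (eyebrow_angle_deg, mouth_curvature, pupil_dilation)
    "happy":    (10.0,   0.8, 0.6),
    "sad":      (-15.0, -0.7, 0.3),
    "surprise": (25.0,   0.2, 0.9),
    "anger":    (-25.0, -0.5, 0.4),
    "neutral":  (0.0,    0.0, 0.5),
}

def render_params(emotion: str, intensity: float = 1.0) -> dict:
    """Scale an archetypal expression by intensity for display on the face."""
    brow, mouth, pupil = EXPRESSIONS[emotion]
    neutral_pupil = EXPRESSIONS["neutral"][2]
    return {
        "eyebrow_angle_deg": brow * intensity,
        "mouth_curvature": mouth * intensity,
        # pupil dilation interpolates between neutral and the target value
        "pupil_dilation": neutral_pupil + (pupil - neutral_pupil) * intensity,
    }
```

A table like this keeps the face controller simple: a single emotion label plus an intensity scalar fully determines the rendered expression, which is consistent with the paper's point that a simplified abstraction can still convey emotion effectively.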