How the Timing and Magnitude of Robot Errors Influence Peoples' Trust of Robots in an Emergency Scenario

Bibliographic Details
Published in: Social Robotics, Vol. 10652, pp. 42-52
Main Authors: Rossi, Alessandra; Dautenhahn, Kerstin; Koay, Kheng Lee; Walters, Michael L.
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 01.01.2017
Series: Lecture Notes in Computer Science

Summary: Trust is a key factor in human users' acceptance of robots in a home or human-oriented environment. Humans should be able to trust that they can safely interact with their robot. Robots will sometimes make errors, due to mechanical or functional failures. It is therefore important that a domestic robot should have acceptable interactive behaviours when exhibiting and recovering from an error situation. In order to define these behaviours, it is firstly necessary to consider that errors can have different degrees of consequences. We hypothesise that the severity of the consequences and the timing of a robot's different types of erroneous behaviours during an interaction may have different impacts on users' attitudes towards a domestic robot. In this study we used an interactive storyboard presenting ten different scenarios in which a robot performed different tasks under five different conditions. Each condition included the ten different tasks performed by the robot, either correctly or with small or big errors; the conditions with errors were complemented with four correct behaviours. At the end of each experimental condition, participants were presented with an emergency scenario to evaluate their current trust in the robot. We conclude that there is a correlation between the magnitude of an error performed by the robot and the corresponding loss of the human's trust in the robot.
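The concluding claim relates the magnitude of a robot's error to the corresponding drop in participants' trust. As a purely illustrative sketch (not the authors' analysis), the following Python snippet shows one way such a monotonic relationship could be quantified with a Spearman rank correlation; the condition coding and trust values below are hypothetical placeholders, not data from the study.

# Illustrative sketch only: hypothetical numbers, not the study's data.
from scipy.stats import spearmanr

# Assumed error-magnitude coding per condition:
# 0 = no error, 1 = small errors, 2 = big errors.
error_magnitude = [0, 1, 1, 2, 2]

# Hypothetical mean trust ratings gathered after the emergency scenario
# in each condition (placeholder values on an arbitrary 1-5 scale).
mean_trust = [4.6, 4.1, 3.9, 3.2, 2.8]

# Spearman's rho tests for a monotonic association between error
# magnitude and trust without assuming a linear relationship.
rho, p_value = spearmanr(error_magnitude, mean_trust)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")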
ISBN: 9783319700212; 3319700219
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-70022-9_5