Communication and knowledge sharing in human–robot interaction and learning from demonstration
| Published in | Neural Networks, Vol. 23, No. 8, pp. 1104–1112 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | United States: Elsevier Ltd, 01.10.2010 |
Summary: Inexpensive personal robots will soon become available to a large portion of the population. Currently, most consumer robots are relatively simple single-purpose machines or toys. To be cost-effective and thus widely accepted, robots will need to accomplish a wide range of tasks in diverse conditions. Learning these tasks from demonstrations offers a convenient mechanism to customize and train a robot by transferring task-related knowledge from a user to the robot, avoiding the time-consuming and complex process of manual programming. The way in which the user interacts with a robot during a demonstration plays a vital role in how effectively and accurately the user can provide that demonstration. Teaching through demonstration is a social activity, one that requires bidirectional communication between a teacher and a student. The work described in this paper studies how the user's visual observation of the robot and the robot's auditory cues affect the user's ability to teach the robot in a social setting. Results show that auditory cues convey important information about the robot's internal state, while visual observation of a robot can hinder an instructor due to incorrect mental models of the robot and distractions from the robot's movements.
| ISSN | 0893-6080; 1879-2782 |
|---|---|
| DOI | 10.1016/j.neunet.2010.06.005 |