Using human gaze in few-shot imitation learning for robot manipulation

Bibliographic Details
Published in: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 8622 - 8629
Main Authors: Hamano, Shogo; Kim, Heecheol; Ohmura, Yoshiyuki; Kuniyoshi, Yasuo
Format: Conference Proceeding
Language: English
Published: IEEE, 23.10.2022

Summary: Imitation learning has attracted attention as a method for realizing complex robot control without explicitly programmed robot behavior. Meta-imitation learning has been proposed to address the high cost of data collection and the poor generalization to new tasks from which imitation learning suffers. By learning multiple tasks during training, meta-imitation learning can learn new tasks involving unknown objects from a small amount of data. However, meta-imitation learning, especially when using images, remains vulnerable to changes in the background, which occupies a large portion of the input image. This study introduces human gaze into meta-imitation-learning-based robot control. Gaze is measured with an eye tracker in a head-mounted display, and a model trained with model-agnostic meta-learning predicts the gaze position from the image. Using the image region around the predicted gaze position as the input makes the model robust to changes in visual information. We experimentally verified the performance of the proposed method on picking tasks with a simulated robot. The results indicate that the proposed method has a greater ability than the conventional method to learn a new task from only 9 demonstrations, even if the object's color or the background pattern changes between training and testing.
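The summary describes feeding the policy only the image region around a predicted gaze position to suppress background changes. Below is a minimal, hypothetical sketch of that cropping step, not the authors' code: the gaze predictor (which the paper meta-trains with MAML) is stubbed out, and the crop size, padding mode, and function names are illustrative assumptions.

```python
# Illustrative sketch: crop the camera image around a predicted gaze position
# so the downstream policy sees mostly task-relevant pixels.
# The gaze predictor itself would be a separately (meta-)trained network;
# here it is not shown, and the gaze coordinates are passed in directly.
import numpy as np


def crop_around_gaze(image: np.ndarray, gaze_xy: tuple, crop: int = 64) -> np.ndarray:
    """Return a (crop x crop) patch centered on the predicted gaze point.

    image   : H x W x C uint8 array from the robot's camera
    gaze_xy : (x, y) pixel coordinates predicted by the gaze model
    crop    : side length of the square patch fed to the policy (assumed value)
    """
    half = crop // 2
    # Pad with edge values so the patch stays valid near the image border.
    padded = np.pad(image, ((half, half), (half, half), (0, 0)), mode="edge")
    x = int(round(gaze_xy[0])) + half
    y = int(round(gaze_xy[1])) + half
    return padded[y - half:y + half, x - half:x + half]


if __name__ == "__main__":
    frame = np.zeros((240, 320, 3), dtype=np.uint8)      # dummy camera frame
    patch = crop_around_gaze(frame, gaze_xy=(300, 10))    # gaze near a corner
    print(patch.shape)                                     # -> (64, 64, 3)
```

Because the crop discards most of the background, changes in background pattern or object color outside the gazed region have little effect on the policy input, which is the robustness property the abstract claims.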
ISSN: 2153-0866
DOI: 10.1109/IROS47612.2022.9981706