Trajectory learning from human demonstrations via manifold mapping

Bibliographic Details
Published in: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3935-3940
Main Authors: Hiratsuka, Michihisa; Makondo, Ndivhuwo; Rosman, Benjamin; Hasegawa, Osamu
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2016
Summary: This work proposes a framework that enables arbitrary robots with unknown kinematic models to imitate human demonstrations to acquire a skill and reproduce it in real time. The diversity of robots active in non-laboratory environments is growing constantly, and to this end we present an approach that allows users to easily teach a skill to a robot with any body configuration. Our proposed method requires a motion trajectory obtained from human demonstrations via a Kinect sensor, which is then projected onto a corresponding human skeleton model. The kinematic mapping between the robot and the human model is learned using Local Procrustes Analysis, which enables the transfer of the demonstrated trajectory from the human model to the robot. Finally, the transferred trajectory is modeled using Dynamic Movement Primitives, allowing it to be reproduced in real time. Experiments in simulation on a 4-degree-of-freedom robot show that our method correctly imitates various skills demonstrated by a human.
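As a rough illustration of the mapping step described in the summary (a sketch only, not the paper's implementation): Local Procrustes Analysis fits similarity transforms between corresponding point sets per local region of the workspace; the core fit in each region is an ordinary Procrustes alignment, which can be computed in closed form via SVD. All function and variable names below are hypothetical.

```python
import numpy as np

def procrustes_map(human_pts, robot_pts):
    """Fit a similarity transform (scale s, rotation R, translation t)
    that sends human-model workspace points onto corresponding robot
    workspace points, minimizing sum ||s*R@h + t - r||^2.
    Ordinary Procrustes analysis; the paper's Local Procrustes Analysis
    fits maps like this per local region of the workspace."""
    mu_h = human_pts.mean(axis=0)
    mu_r = robot_pts.mean(axis=0)
    Hc = human_pts - mu_h           # centered human points (N x 3)
    Rc = robot_pts - mu_r           # centered robot points (N x 3)
    # Orthogonal Procrustes: SVD of the 3x3 cross-covariance matrix
    U, S, Vt = np.linalg.svd(Hc.T @ Rc)
    R = (U @ Vt).T                  # rotation: human frame -> robot frame
    s = S.sum() / (Hc ** 2).sum()   # optimal uniform scale
    t = mu_r - s * R @ mu_h         # translation aligning the centroids
    return s, R, t

def transfer(traj, s, R, t):
    """Map a demonstrated trajectory (N x 3) into the robot's workspace."""
    return (s * (R @ traj.T)).T + t
```

Given a few corresponding human/robot points, `procrustes_map` recovers the transform and `transfer` maps the whole demonstrated trajectory point by point; the subsequent Dynamic Movement Primitives step would then fit a dynamical system to the transferred trajectory for real-time reproduction.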
ISSN: 2153-0866
DOI: 10.1109/IROS.2016.7759579