Imitation learning of hand gestures and its evaluation for humanoid robots

Bibliographic Details
Published in: 2010 International Conference on Information and Automation, pp. 60-65
Main Authors: Thobbi, Anand; Sheng, Weihua
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2010
ISBN: 1424457017, 9781424457014
DOI: 10.1109/ICINFA.2010.5512333

Summary: This paper presents a platform to implement and evaluate a learning-by-imitation framework which enables humanoid robots to learn hand gestures from human beings. A marker-based system is used to capture human motion data. From this data we extract the shoulder and elbow joint angles, which uniquely characterize a particular hand gesture. The proposed imitation learning framework aims to generalize over multiple demonstrations of the same hand gesture and thus learn it. The joint-angle trajectories used for training are first aligned temporally using Dynamic Time Warping (DTW) and then generalized by weighted averaging. The framework operates in the joint space. The algorithm has been implemented and tested on the Nao humanoid robot. We also propose a novel method to evaluate the imitation learning framework: markers are placed on the robot's arm analogous to the placement of markers on the human subject's arm, and the respective joint-angle trajectories are then compared.
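
The core pipeline described in the abstract (temporal alignment of several joint-angle demonstrations with DTW, followed by weighted averaging to obtain a generalized gesture) can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the function names (dtw_path, learn_gesture), the use of four joint angles per frame (two shoulder, two elbow), and the uniform weighting default are assumptions made for the example.

# Illustrative sketch (not the authors' code): align several demonstrations of a
# gesture with dynamic time warping, then combine them by weighted averaging.
# Each trajectory is assumed to be an array of shape (T, 4): two shoulder and
# two elbow joint angles per frame.
import numpy as np

def dtw_path(ref, traj):
    """Return the DTW warping path between two trajectories as (i, j) index pairs."""
    n, m = len(ref), len(traj)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(ref[i - 1] - traj[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack from the end of both trajectories to recover the alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def learn_gesture(demos, weights=None):
    """Warp every demonstration onto the first one, then average frame by frame."""
    ref = demos[0]
    if weights is None:
        weights = np.ones(len(demos))          # uniform weights by default
    aligned = [ref]
    for demo in demos[1:]:
        warped = np.zeros_like(ref)
        counts = np.zeros(len(ref))
        for i, j in dtw_path(ref, demo):
            warped[i] += demo[j]               # accumulate frames mapped onto index i
            counts[i] += 1
        aligned.append(warped / counts[:, None])
    aligned = np.stack(aligned)                # (num_demos, T, num_joints)
    w = np.asarray(weights, dtype=float)
    return (w[:, None, None] * aligned).sum(axis=0) / w.sum()

# Usage example: three noisy demonstrations of the same 100-frame, 4-joint gesture.
t = np.linspace(0, 1, 100)[:, None]
demos = [np.sin(2 * np.pi * t) @ np.ones((1, 4)) + 0.05 * np.random.randn(100, 4)
         for _ in range(3)]
generalized = learn_gesture(demos)
print(generalized.shape)  # (100, 4) generalized joint-angle trajectory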