Kinesthetic teaching of visuomotor coordination for pointing by the humanoid robot iCub

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 112, pp. 179–188
Main Authors: Lemme, Andre; Freire, Ananda; Barreto, Guilherme; Steil, Jochen
Format: Journal Article
Language: English
Published: Elsevier B.V., 18.07.2013

Summary: Pointing at something means orienting the hand, arm, head, or body in the direction of an object or event. This skill constitutes a basic communicative ability for cognitive agents such as humanoid robots. The goal of this study is to show that approximate and, in particular, precise pointing can be learned as a direct mapping from the object's pixel coordinates in the visual field to hand positions or to joint angles. This highly nonlinear mapping defines the pose and orientation of the robot's arm. The study underlines that this is possible without explicitly computing the object's depth and 3D position, since only the pointing direction is required. To this end, three state-of-the-art neural network paradigms (multilayer perceptron, extreme learning machine, and reservoir computing) are evaluated on real-world data gathered from the humanoid robot iCub. For the case of precise pointing, training data are generated interactively and recorded via kinesthetic teaching. Successful generalization is verified on the iCub using a laser pointer attached to its hand.
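To illustrate the kind of direct pixel-to-joint-angle regression the summary describes, the sketch below trains an extreme learning machine (one of the three paradigms evaluated in the paper) in plain NumPy. This is not the authors' code or data: the pixel coordinates and "joint angles" here are synthetic stand-ins for the kinesthetic-teaching recordings from the iCub, and the network sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: a smooth, nonlinear map from 2D pixel
# coordinates (u, v) to 4 "joint angles", fabricated only to
# exercise the learning code (the paper uses real iCub recordings).
n_samples, n_hidden, n_joints = 500, 100, 4
pixels = rng.uniform(0, 1, size=(n_samples, 2))
def true_map(p):
    return np.column_stack([
        np.sin(2 * p[:, 0]), np.cos(3 * p[:, 1]),
        p[:, 0] * p[:, 1], p[:, 0] - p[:, 1],
    ])
angles = true_map(pixels)                      # target joint angles

# Extreme learning machine: a fixed random hidden layer whose
# weights are never trained, followed by a linear readout.
W_in = rng.normal(size=(2, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(pixels @ W_in + b)                 # hidden activations

# Only the readout is learned, via ridge-regularized least squares.
lam = 1e-6
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ angles)

# Evaluate generalization on held-out pixel coordinates.
test_pixels = rng.uniform(0, 1, size=(100, 2))
pred = np.tanh(test_pixels @ W_in + b) @ W_out
mse = np.mean((pred - true_map(test_pixels)) ** 2)
print(f"test MSE: {mse:.2e}")
```

Because the hidden layer stays fixed, training reduces to one linear solve, which is what makes the ELM attractive for interactively collected kinesthetic-teaching data.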
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2012.12.040