Morphology Specific Stepwise Learning of In-Hand Manipulation With a Four-Fingered Hand

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, Vol. 16, no. 1, pp. 433-441
Main Authors: Funabashi, Satoshi; Schmitz, Alexander; Ogasa, Shun; Sugano, Shigeki
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2020

Summary: In past research, in-hand manipulation of objects of various sizes and shapes has been achieved. However, the network had to be trained for each different motion. Training data takes time to acquire and increases the hardware load, thereby increasing the cost of training data. Four-fingered in-hand manipulation is especially difficult, as a high number of joints need to be controlled in synchrony. This paper presents a method that reduces the required training data for in-hand manipulation through pretraining and mutual finger motions. The Allegro Hand is used with soft fingertips and integrated 6-axis F/T sensors to evaluate the proposed method. To make the network more versatile, the training data included objects of various sizes and shapes. When the network is pretrained, one-shot learning suffices to learn a new task; mutual finger motions can be exploited to use three-fingered pretraining data for four-fingered manipulation. Both data-sharing and weight-sharing were used and showed similar results. Crucially, pretraining data from fingers with the same kinematic chain has to be used, showing the importance of morphology-specific learning. Moreover, objects with untrained sizes and shapes could be manipulated.
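The transfer recipe the abstract describes, pretraining a controller on abundant three-fingered data and then adapting a fourth finger's controller from those weights with a single update, can be illustrated with a toy linear model. All names, dimensions, and data here are hypothetical stand-ins, not the authors' actual network; this is a minimal sketch of weight-sharing followed by one-shot adaptation, assuming a simple regression from 6-axis F/T features to joint commands:

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(W, X, Y):
    """Mean squared error of the linear controller X @ W against targets Y."""
    return float(np.mean((X @ W - Y) ** 2))

def gd_step(W, X, Y, lr=0.01):
    """One gradient-descent step on the MSE objective."""
    grad = 2 * X.T @ (X @ W - Y) / len(X)
    return W - lr * grad

# "Pretraining": plentiful (synthetic) three-fingered manipulation data,
# mapping 6-axis F/T readings to 4 joint commands.
X_pre = rng.normal(size=(200, 6))
Y_pre = X_pre @ rng.normal(size=(6, 4)) + 0.01 * rng.normal(size=(200, 4))
W = np.zeros((6, 4))
for _ in range(500):
    W = gd_step(W, X_pre, Y_pre)

# "Weight-sharing": the new controller starts from the pretrained weights
# instead of a random initialization, then a single ("one-shot") update
# adapts it to one sample from the new task.
X_new = rng.normal(size=(1, 6))
Y_new = X_new @ rng.normal(size=(6, 4))
before = mse(W, X_new, Y_new)
W_adapted = gd_step(W, X_new, Y_new)
after = mse(W_adapted, X_new, Y_new)
print(after < before)  # the one-shot update reduces error on the new task
```

The design point the sketch captures is the initialization: starting the new controller from weights learned on a kinematically matching finger is what lets a single update suffice, whereas a random start would need the full training run again.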
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/TII.2019.2893713