Embodiment-specific representation of robot grasping using graphical models and latent-space discretization

Bibliographic Details
Published in: 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 980-986
Main Authors: Song, D., Ek, C. H., Huebner, K., Kragic, D.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2011

Summary: We study embodiment-specific robot grasping tasks, represented in a probabilistic framework. The framework consists of a Bayesian network (BN) integrated with a novel multi-variate discretization model. The BN models the probabilistic relationships among tasks, objects, grasping actions and constraints. The discretization model provides a compact data representation that allows efficient learning of the conditional structures in the BN. To evaluate the framework, we use a database generated in a simulated environment including examples of a human and a robot hand interacting with objects. The results show that the different kinematic structures of the hands affect both the BN structure and the conditional distributions over the modeled variables. Both models achieve accurate task classification, and successfully encode the semantic task requirements in the continuous observation spaces. In an imitation experiment, we demonstrate that the representation framework can transfer task knowledge between different embodiments, and is therefore a suitable model for grasp planning and imitation in a goal-directed manner.
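
The paper does not publish code; as an illustration of the general pattern it describes (not the authors' implementation), the following hypothetical Python sketch discretizes continuous grasp observations and classifies the intended task with a small discrete Bayesian network. The feature names, bin counts, toy data, and the naive-Bayes factorization (Task as the sole parent of each discretized feature) are all assumptions made for the example; the paper's multivariate discretization model and its learned BN structure are more sophisticated.

```python
import numpy as np

# Hypothetical stand-in for the paper's multivariate discretization model:
# independent uniform binning per feature. The actual model learns a
# latent-space discretization jointly with the BN.
def fit_bins(X, n_bins=5):
    """Compute interior bin edges for each feature column."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return [np.linspace(l, h, n_bins + 1)[1:-1] for l, h in zip(lo, hi)]

def discretize(X, edges):
    """Map continuous features (n, d) to integer bin indices (n, d)."""
    return np.stack([np.digitize(X[:, j], edges[j])
                     for j in range(X.shape[1])], axis=1)

# Toy data: two continuous grasp features (say, hand aperture and
# approach height) observed under two tasks, "hand-over" and "pouring".
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.08, 0.10], 0.02, size=(50, 2)),   # task 0
               rng.normal([0.04, 0.20], 0.02, size=(50, 2))])  # task 1
y = np.array([0] * 50 + [1] * 50)

n_tasks, n_bins = 2, 5
edges = fit_bins(X, n_bins)
Xd = discretize(X, edges)

# Discrete BN with the assumed structure Task -> each feature; the paper
# instead learns the conditional structure from data.
prior = np.bincount(y, minlength=n_tasks) / len(y)
cpts = np.ones((Xd.shape[1], n_tasks, n_bins))          # Laplace smoothing
for t in range(n_tasks):
    for j in range(Xd.shape[1]):
        cpts[j, t] += np.bincount(Xd[y == t, j], minlength=n_bins)
cpts /= cpts.sum(axis=2, keepdims=True)

def classify(x_cont):
    """P(task | observed grasp features) by exact enumeration."""
    xd = discretize(x_cont, edges)[0]
    log_post = np.log(prior) + sum(np.log(cpts[j, :, xd[j]])
                                   for j in range(len(xd)))
    p = np.exp(log_post - log_post.max())
    return p / p.sum()

print(classify(np.array([[0.05, 0.19]])))  # should favor task 1 ("pouring")
```

Because the task posterior is computed over discretized observations, the same trained network can in principle be queried in the other direction, e.g. conditioning on a task to obtain a distribution over grasp-feature bins, which mirrors how the paper uses the BN for goal-directed grasp planning and imitation.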
ISBN: 1612844545, 9781612844541
ISSN: 2153-0858, 2153-0866
DOI: 10.1109/IROS.2011.6094503