Embodiment-Specific Representation of Robot Grasping using Graphical Models and Latent-Space Discretization
We study embodiment-specific robot grasping tasks, represented in a probabilistic framework. The framework consists of a Bayesian network (BN) integrated with a novel multi-variate discretization model. The BN models the probabilistic relationships among tasks, objects, grasping actions and constraints. The discretization model provides compact data representation that allows efficient learning of the conditional structures in the BN. To evaluate the framework, we use a database generated in a simulated environment including examples of a human and a robot hand interacting with objects. The results show that the different kinematic structures of the hands affect both the BN structure and the conditional distributions over the modeled variables. Both models achieve accurate task classification, and successfully encode the semantic task requirements in the continuous observation spaces. In an imitation experiment, we demonstrate that the representation framework can transfer task knowledge between different embodiments, and is therefore a suitable model for grasp planning and imitation in a goal-directed manner.
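The abstract describes a pipeline of discretizing continuous observation spaces and then learning a discrete BN over tasks, objects, grasping actions and constraints, which can be queried for task classification. As a rough illustration only, the sketch below assumes a fixed naive-Bayes (star) topology with the task variable as the single parent, and simple per-feature uniform binning; the paper instead learns both the BN structure and a multi-variate discretization model from data. All names here (`discretize`, `TaskBN`) are hypothetical.

```python
import numpy as np

def discretize(X, n_bins=5):
    """Per-dimension uniform binning of continuous observations.

    The paper learns a multivariate discretization model jointly with
    the BN; this independent per-feature binning is only a stand-in.
    """
    edges = [np.linspace(X[:, j].min(), X[:, j].max(), n_bins + 1)[1:-1]
             for j in range(X.shape[1])]
    return np.column_stack(
        [np.digitize(X[:, j], edges[j]) for j in range(X.shape[1])])

class TaskBN:
    """Toy discrete BN with a fixed star topology: task -> each feature.

    The BN structure in the paper is learned per embodiment; this
    naive-Bayes layout is assumed here purely for illustration.
    """
    def __init__(self, n_tasks, n_bins=5, alpha=1.0):
        self.n_tasks, self.n_bins, self.alpha = n_tasks, n_bins, alpha

    def fit(self, Xd, tasks):
        # Task prior P(T) from label counts.
        counts = np.bincount(tasks, minlength=self.n_tasks).astype(float)
        self.prior = counts / counts.sum()
        # CPTs P(feature_j = bin | T = t) with Laplace smoothing alpha.
        n_feats = Xd.shape[1]
        self.cpt = np.full((n_feats, self.n_tasks, self.n_bins), self.alpha)
        for x, t in zip(Xd, tasks):
            for j, b in enumerate(x):
                self.cpt[j, t, b] += 1.0
        self.cpt /= self.cpt.sum(axis=2, keepdims=True)
        return self

    def posterior(self, x):
        # P(T | x) via Bayes' rule, in log space for numerical stability.
        logp = np.log(self.prior).copy()
        for j, b in enumerate(x):
            logp += np.log(self.cpt[j, :, b])
        p = np.exp(logp - logp.max())
        return p / p.sum()
```

Training one such model per embodiment (human hand vs. robot hand) mirrors the paper's observation that the learned structure and conditional distributions differ across kinematic structures. Note the toy `discretize` recomputes bin edges from whatever data it is given; in practice the edges fitted on training data would be stored and reused at query time.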
Published in | 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), p. 980
---|---
Main Authors |
Format | Conference Proceeding
Language | English
Published | 2011
Series | IEEE International Conference on Intelligent Robots and Systems
ISBN | 1612844545, 9781612844541
DOI | 10.1109/IROS.2011.6048145
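On the imitation experiment mentioned in the abstract: under the same toy assumptions as the sketch above, embodiment transfer can be phrased as inferring the task posterior from a human-hand demonstration and then ranking candidate robot grasps under the robot-hand model conditioned on that posterior. The helper below is a hypothetical illustration reusing the `TaskBN` toy, not the paper's method:

```python
import numpy as np

def imitate(human_model, robot_model, x_human, robot_candidates):
    """Goal-directed transfer sketch: infer the task from a human-hand
    demonstration, then rank discretized robot grasp candidates by
    sum_t P(t | demo) * prod_j P(x_j | t) under the robot-hand model.
    """
    w = human_model.posterior(x_human)        # task posterior from the demo
    scores = []
    for x in robot_candidates:
        lik = np.ones(robot_model.n_tasks)
        for j, b in enumerate(x):
            lik *= robot_model.cpt[j, :, b]   # P(feature_j = b | T)
        scores.append(float(w @ lik))
    return int(np.argmax(scores))             # index of the best candidate
```

Because the task variable is the only quantity shared between the two models, the grasp itself never has to be mapped between the differing kinematic structures; only the semantic task requirement is transferred, which is the sense in which the framework supports goal-directed imitation.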