REGRAD: A Large-Scale Relational Grasp Dataset for Safe and Object-Specific Robotic Grasping in Clutter
Published in: IEEE Robotics and Automation Letters, Vol. 7, No. 2, pp. 2929-2936
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.04.2022
Summary: Despite the impressive progress achieved in robotic grasping, robots are not yet skilled at sophisticated tasks (e.g., searching for and grasping a specified target in clutter). Such tasks involve not only grasping but also comprehensive perception of the world (e.g., the relationships among objects). Recently, encouraging results have demonstrated that such high-level concepts can be learned. However, these algorithms are usually data-intensive, and the lack of data severely limits their performance. In this letter, we present a new dataset named REGRAD for learning the relationships among objects and grasps. We collect annotations of object poses, segmentations, grasps, and relationships for target-driven relational grasping tasks. Our dataset is provided in the form of both 2D images and 3D point clouds. Moreover, since all the data are generated automatically, new objects can be freely imported for data generation. We also release a real-world validation dataset to evaluate the sim-to-real performance of models trained on REGRAD. Finally, we conduct a series of experiments showing that models trained on REGRAD generalize well to realistic scenarios, in terms of both relationship and grasp detection. Our dataset and code are available online.1
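The abstract lists four kinds of annotations (object poses, segmentations, grasps, and object relationships) collected as both 2D images and 3D point clouds. The minimal Python sketch below illustrates how such a scene record might be organized and how a relationship graph could be used to pick objects that are safe to grasp first. All class and field names, array shapes, and the (top, bottom) relationship convention are illustrative assumptions, not the actual REGRAD file format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

import numpy as np


@dataclass
class ObjectAnnotation:
    """One object in a cluttered scene (illustrative fields only)."""
    object_id: int
    pose: np.ndarray               # assumed 4x4 homogeneous transform in the camera frame
    segmentation_mask: np.ndarray  # assumed HxW boolean mask in the RGB image
    grasps: np.ndarray             # assumed Nx7 grasp candidates (position + quaternion)


@dataclass
class SceneAnnotation:
    """One annotated scene: 2D image, 3D point cloud, objects, and relationships."""
    rgb_image: np.ndarray          # HxWx3 color image
    point_cloud: np.ndarray        # Mx3 points
    objects: List[ObjectAnnotation] = field(default_factory=list)
    # (top_id, bottom_id) pairs meaning "top rests on bottom"; this is an
    # assumed encoding of the object relationships described in the abstract.
    relationships: List[Tuple[int, int]] = field(default_factory=list)


def safe_to_grasp(scene: SceneAnnotation) -> List[int]:
    """Ids of objects with nothing resting on them, i.e. removable without disturbing others."""
    blocked = {bottom for _, bottom in scene.relationships}
    return [obj.object_id for obj in scene.objects if obj.object_id not in blocked]
```

In this sketch, an object is considered safe to grasp only if it never appears as the supporting member of a relationship pair, which mirrors the "safe and object-specific grasping" goal stated in the title; the real dataset and its tooling may expose this information differently.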
ISSN: 2377-3766
DOI: 10.1109/LRA.2022.3142401