Manipulating articulated objects with interactive perception

Bibliographic Details
Published in: 2008 IEEE International Conference on Robotics and Automation, pp. 272–277
Main Authors: Katz, D., Brock, O.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2008

Summary: Robust robotic manipulation and perception remains a difficult challenge, in particular in unstructured environments. To address this challenge, we propose to couple manipulation and perception. The robot observes its own deliberate interactions with the world. These interactions reveal sensory information that would otherwise remain hidden and facilitate the interpretation of perceptual data. To demonstrate the effectiveness of interactive perception we present a skill for the manipulation of articulated objects. We show how UMan, our mobile manipulation platform, obtains a kinematic model of an unknown object. The model then enables the robot to perform purposeful manipulation. Our algorithm is extremely robust and does not require prior knowledge of the object; it is insensitive to lighting, texture, color, specularities, and background, and it is computationally highly efficient.
ISBN: 1424416469, 9781424416462
ISSN: 1050-4729
DOI: 10.1109/ROBOT.2008.4543220
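
The summary above describes recovering a kinematic model of an articulated object from the robot's own interactions. Below is a minimal, hypothetical Python sketch of one ingredient of such a pipeline, not the paper's algorithm: assuming tracked feature points have already been segmented into two rigidly moving clusters, the relative motion between the clusters is used to label the connecting joint as prismatic or revolute. The function names (rigid_transform, classify_joint) and the angle threshold are illustrative assumptions.

import numpy as np


def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t


def classify_joint(body_a, body_b, angle_tol_deg=2.0):
    """Label the joint between two tracked bodies as 'prismatic' or 'revolute'.

    body_a, body_b: arrays of shape (frames, points, 2) with tracked 2D features.
    The motion of body_b is expressed in body_a's frame; a near-zero relative
    rotation over the whole sequence suggests a prismatic joint, otherwise revolute.
    """
    angles = []
    for f in range(1, body_a.shape[0]):
        # Transform explaining body_a's motion between frame 0 and frame f.
        R_a, t_a = rigid_transform(body_a[0], body_a[f])
        # Express body_b's points at frame f in body_a's frame-0 coordinates
        # (inverse of x -> R_a x + t_a, written for row vectors).
        b_in_a = (body_b[f] - t_a) @ R_a
        # Relative rotation of body_b with respect to body_a since frame 0.
        R_rel, _ = rigid_transform(body_b[0], b_in_a)
        angles.append(abs(np.degrees(np.arctan2(R_rel[1, 0], R_rel[0, 0]))))
    return "prismatic" if max(angles) < angle_tol_deg else "revolute"


if __name__ == "__main__":
    # Synthetic example: body A stays still, body B swings about a hinge at the origin.
    rng = np.random.default_rng(0)
    pts_a = rng.uniform(-1.0, 0.0, size=(8, 2))      # features on the static body
    pts_b = rng.uniform(0.5, 1.5, size=(8, 2))       # features on the moving body
    frames_a, frames_b = [], []
    for theta in np.linspace(0.0, 0.8, 10):          # hinge opens by about 46 degrees
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        frames_a.append(pts_a.copy())
        frames_b.append(pts_b @ R.T)
    print(classify_joint(np.array(frames_a), np.array(frames_b)))  # -> revolute

Because only the relative rotation between the two clusters is inspected, the sketch does not depend on where the hinge axis lies or on appearance cues such as lighting or texture; a real system would additionally have to track the features and segment them into rigid bodies before this step.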