Velocity control for safe robot guidance based on fused vision and force/torque data


Bibliographic Details
Published in: 2006 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 485 - 492
Main Authors: Kuhn, S., Gecks, T., Henrich, D.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2006

Summary: We present a method for securing guided robot motions in the context of human/robot cooperation. For this, we limit the maximum allowable velocity of the robot based on its distance to the human or to the nearest obstacle, and generate the effective velocity from guidance information provided by the interacting human. To this end, we fuse two heterogeneous data types from cameras and a force/torque sensor. The cameras monitor the robot's workspace using a difference-image method. Given this obstacle information, distances between the robot and humans or objects in the environment are calculated; the distance within each image is determined via an extended difference-image method. The distances acquired from the individual cameras are fused to approximate the real robot-to-object distance within the workspace, and this distance regulates the maximum allowable velocity of the robot. The force/torque sensor provides the guidance information, i.e., the magnitude and direction of the applied force and moment. This information is used to generate the robot's movement while respecting the maximum allowable velocity.
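
The control scheme described in the summary can be sketched in a few lines: per-image distance estimates are fused into a single robot-to-object distance, that distance is mapped to a speed limit, and the force/torque reading is turned into a velocity command clamped to that limit. The sketch below is a minimal illustration of this idea under stated assumptions, not the authors' implementation; the fusion rule (here simply the minimum of the per-camera estimates), the linear distance-to-speed ramp, and all gains, thresholds, and function names are assumptions for illustration only.

```python
import numpy as np

def max_allowable_speed(distance, d_stop=0.05, d_full=1.0, v_max=0.25):
    """Map robot-to-obstacle distance [m] to a speed limit [m/s].
    Linear ramp: zero at/below d_stop, full speed at/above d_full.
    (Illustrative profile; the paper's limit function may differ.)"""
    if distance <= d_stop:
        return 0.0
    if distance >= d_full:
        return v_max
    return v_max * (distance - d_stop) / (d_full - d_stop)

def fuse_camera_distances(per_camera_distances):
    """Fuse per-image distance estimates into one approximation of the
    true robot-to-object distance. Taking the minimum is a conservative
    assumption; the paper's fusion rule may combine them differently."""
    return min(per_camera_distances)

def guided_velocity(force, fused_distance, force_gain=0.002, force_deadband=2.0):
    """Turn the measured guidance force [N] into a Cartesian velocity
    command [m/s], clamped to the distance-dependent speed limit.
    Moment-based reorientation is omitted in this sketch."""
    f = np.asarray(force, dtype=float)
    magnitude = np.linalg.norm(f)
    if magnitude < force_deadband:          # ignore sensor noise / light contact
        return np.zeros(3)
    direction = f / magnitude
    desired_speed = force_gain * (magnitude - force_deadband)
    speed = min(desired_speed, max_allowable_speed(fused_distance))
    return speed * direction

# One control cycle: fuse camera distances, then follow the guidance force.
v_cmd = guided_velocity(force=[0.0, 12.0, 3.0],
                        fused_distance=fuse_camera_distances([0.8, 0.6, 0.7]))
```

In an actual controller this would run once per control cycle, with the per-camera distances recomputed from the latest difference images before each velocity command is issued.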
ISBN: 1424405661, 9781424405664
DOI: 10.1109/MFI.2006.265623