Image-Based Visual Servoing Control of Robot Manipulators Using Hybrid Algorithm With Feature Constraints

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 223495-223508
Main Authors: Ren, Xiaolin; Li, Hongwen; Li, Yuanchun
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020

More Information
Summary: The challenge in uncalibrated visual servoing (VS) control of robot manipulators in unstructured environments is to obtain an appropriate interaction matrix and to keep the image features within the field of view (FOV), especially when non-Gaussian noise disturbances exist in the VS process. In this article, a hybrid control algorithm that combines a bidirectional extreme learning machine (B-ELM) with a smooth variable structure filter (SVSF) is proposed to estimate the interaction matrix and handle visibility constraints. For VS, the nonlinear mapping between the image features and the interaction matrix is approximated by B-ELM learning. To improve robustness against interference, the SVSF is employed to re-estimate the interaction matrix. A constraint function relating feature coordinates to region boundaries is defined and added to the velocity controller; it drives image features away from the restricted region and ensures smooth velocities. Since neither camera nor robot model parameters are required to develop the control strategy, the servoing task can be accomplished flexibly and simply. Simulation and experimental results on a conventional 6-degree-of-freedom manipulator verify the effectiveness of the proposed method.
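The record does not include the paper's equations, but the scheme builds on the classical image-based visual servoing (IBVS) law, in which the camera velocity is computed from the pseudo-inverse of an estimated interaction matrix, with an extra term steering features away from the FOV boundary. The sketch below illustrates that structure only: the function name, the simple border-repulsion gradient standing in for the paper's constraint function, and all gains are assumptions, not the authors' B-ELM/SVSF implementation.

```python
import numpy as np

def ibvs_velocity(features, desired, L_hat, lam=0.5, bounds=(0.0, 640.0), k_c=0.1):
    """Classical IBVS velocity law plus an illustrative field-of-view penalty.

    features, desired : (2N,) stacked pixel coordinates of N point features
    L_hat             : (2N, 6) estimated interaction (image Jacobian) matrix
    lam, k_c          : control and constraint gains (assumed values)
    Returns a 6-vector camera velocity [vx, vy, vz, wx, wy, wz].
    """
    e = features - desired                      # image-space feature error
    L_pinv = np.linalg.pinv(L_hat)              # pseudo-inverse of the estimate
    v = -lam * L_pinv @ e                       # standard IBVS regulation term

    # Gradient of a repulsive potential 1/(s - lo) + 1/(hi - s) that grows
    # near the image border; a stand-in for the paper's constraint function.
    lo, hi = bounds
    grad = -1.0 / (features - lo + 1e-6) ** 2 + 1.0 / (hi - features + 1e-6) ** 2
    v += -k_c * L_pinv @ grad                   # push features away from the border
    return v
```

With the features at the image center and zero error, both terms vanish and the camera stays at rest; as a feature approaches a border, the penalty term dominates and drives it back toward the interior, which matches the role of the constraint function described in the summary.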
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3042207