A General Monocular Visual Servoing Structure for Mobile Robots in Natural Scene Using SLAM

Bibliographic Details
Published in Cognitive Systems and Signal Processing, Vol. 1006, pp. 465-476
Main Authors Li, Chenping; Zhang, Xuebo; Gao, Haiming
Format Book Chapter
Language English
Published Singapore: Springer Singapore Pte. Limited, 2019
Series Communications in Computer and Information Science

Summary:In this paper, a general visual servoing structure for mobile robots is proposed to handle the situation in which the target scene moves out of the camera's field of view. Most existing visual servoing strategies rely on the assumption that the current image always shares common feature points with the desired one throughout the servoing procedure, which the controller cannot actually guarantee. To avoid this problem, simultaneous localization and mapping (SLAM) is introduced into the visual servoing system: the SLAM front-end estimates the current pose of the mobile robot, and the back-end optimizes its desired pose. Moreover, compared with traditional servoing systems based on artificial feature points, the scale of the robot poses is fixed by the map in the proposed scheme, which makes it applicable in natural scenes. In addition, any position-based visual servoing controller can be implemented within the proposed servoing architecture. The servoing structure has been implemented on a nonholonomic mobile robot, and experimental results are presented to illustrate the effectiveness and feasibility of the proposed approach.
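Illustrative note: the summary states that any position-based visual servoing controller can be plugged into the architecture, with the SLAM front-end supplying the current robot pose and the back-end supplying the optimized desired pose. The Python sketch below shows, purely as an assumed example and not as the chapter's actual controller, how a classical polar-coordinate pose regulator for a unicycle-type (nonholonomic) robot could consume those two poses; all function names, gain values, and the choice of this particular control law are illustrative assumptions.

```python
import math


def wrap_to_pi(angle):
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))


def pbvs_unicycle_control(current_pose, desired_pose,
                          k_rho=0.5, k_alpha=1.5, k_beta=-0.6):
    """Classical polar-coordinate pose controller for a unicycle robot.

    current_pose: (x, y, theta), e.g. the pose estimated by a SLAM front-end.
    desired_pose: (x_d, y_d, theta_d), e.g. an optimized goal pose from a SLAM back-end.
    Returns (v, omega): forward and angular velocity commands.
    Local stability requires k_rho > 0, k_beta < 0, and k_alpha > k_rho.
    (Gains here are arbitrary illustrative values.)
    """
    x, y, theta = current_pose
    x_d, y_d, theta_d = desired_pose

    dx, dy = x_d - x, y_d - y
    rho = math.hypot(dx, dy)                        # distance to the goal
    alpha = wrap_to_pi(math.atan2(dy, dx) - theta)  # heading error toward the goal
    beta = wrap_to_pi(theta_d - math.atan2(dy, dx)) # final orientation error at the goal

    v = k_rho * rho
    omega = k_alpha * alpha + k_beta * beta
    return v, omega


# Example usage with hypothetical poses (world frame, meters and radians):
v, omega = pbvs_unicycle_control((0.0, 0.0, 0.0), (2.0, 1.0, math.pi / 2))
print(v, omega)
```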
Bibliography:This work is supported in part by the National Natural Science Foundation of China under Grants 61573195 and U1613210, and in part by the Tianjin Science and Technology Program under Grant 17KPXMSF00110.
ISBN:9789811379857
9811379858
ISSN:1865-0929
1865-0937
DOI:10.1007/978-981-13-7986-4_41