Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual–IMU–Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments

Bibliographic Details
Published in: Agronomy (Basel), Vol. 12, No. 8, p. 1740
Main Authors: Yan, Yaxuan; Zhang, Baohua; Zhou, Jun; Zhang, Yibo; Liu, Xiao'ang
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 23.07.2022
Summary: Autonomous navigation in greenhouses requires agricultural robots to localize and generate a globally consistent map of their surroundings in real-time. However, accurate and robust localization and mapping are still challenging for agricultural robots due to the unstructured, dynamic and GPS-denied environmental conditions. In this study, a state-of-the-art real-time localization and mapping system was presented to achieve precise pose estimation and dense three-dimensional (3D) point cloud mapping in complex greenhouses by utilizing multi-sensor fusion and Visual–IMU–Wheel odometry. In this method, measurements from wheel odometry, an inertial measurement unit (IMU) and a tightly coupled visual–inertial odometry (VIO) are integrated into a loosely coupled framework based on the Extended Kalman Filter (EKF) to obtain a more accurate state estimation of the robot. In the multi-sensor fusion algorithm, the pose estimations from the wheel odometry and IMU are treated as predictions and the localization results from VIO are used as observations to update the state vector. Simultaneously, the dense 3D map of the greenhouse is reconstructed in real-time by employing the modified ORB-SLAM2. The performance of the proposed system was evaluated in modern standard solar greenhouses with harsh environmental conditions. Taking advantage of measurements from individual sensors, our method is robust enough to cope with various challenges, as shown by extensive experiments conducted in the greenhouses and an outdoor campus environment. Additionally, the results show that our proposed framework can improve the localization accuracy of the visual–inertial odometry, demonstrating the satisfactory capability of the proposed approach and highlighting its promising applications in autonomous navigation of agricultural robots.
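The loosely coupled EKF structure described in the summary can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes a planar pose state [x, y, yaw], illustrative noise matrices, and that the VIO output is treated as a direct pose observation, purely to show the prediction step driven by wheel-odometry/IMU measurements and the update step driven by VIO poses.

```python
# Minimal sketch (illustrative only, not the paper's implementation) of a
# loosely coupled EKF: wheel-odometry speed and IMU yaw rate drive the
# prediction step, and VIO pose estimates are fused as observations.
import numpy as np

def wrap_angle(a):
    """Keep an angle in (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

class LooselyCoupledEKF:
    def __init__(self):
        self.x = np.zeros(3)                  # state: [x, y, yaw]
        self.P = np.eye(3) * 1e-3             # state covariance
        self.Q = np.diag([1e-3, 1e-3, 1e-4])  # process noise (wheel/IMU drift), illustrative
        self.R = np.diag([1e-2, 1e-2, 1e-3])  # VIO measurement noise, illustrative

    def predict(self, v, omega, dt):
        """Propagate the pose with wheel-odometry speed v and IMU yaw rate omega."""
        x, y, yaw = self.x
        self.x = np.array([x + v * dt * np.cos(yaw),
                           y + v * dt * np.sin(yaw),
                           wrap_angle(yaw + omega * dt)])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(yaw)],
                      [0.0, 1.0,  v * dt * np.cos(yaw)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z_vio):
        """Correct the pose with a VIO observation z_vio = [x, y, yaw]."""
        H = np.eye(3)                         # VIO assumed to observe the pose directly
        innov = z_vio - H @ self.x
        innov[2] = wrap_angle(innov[2])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P
```

In this arrangement the filter remains loosely coupled: the VIO runs as a self-contained module and only its pose output enters the EKF, so a failure of any single sensor degrades rather than breaks the fused estimate.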
ISSN: 2073-4395
DOI: 10.3390/agronomy12081740