Sensor fusion-based approach for the field robot localization on Rovitis 4.0 vineyard robot

Bibliographic Details
Published in: International Journal of Agricultural and Biological Engineering, Vol. 15, No. 6, pp. 91-95
Main Authors: Rakun, Jurij; Pantano, Matteo; Lepej, Peter; Lakota, Miran
Format: Journal Article
Language: English
Published: Beijing: International Journal of Agricultural and Biological Engineering (IJABE), 01.11.2022

Summary: This study proposed an approach for robot localization using data from multiple low-cost sensors, with two goals in mind: to produce accurate localization data and to keep the computation as simple as possible. The approach used data from wheel odometry, inertial-motion data from an Inertial Motion Unit (IMU), and a location fix from a Real-Time Kinematic Global Positioning System (RTK GPS). Each of these sensors is prone to errors in certain situations, resulting in inaccurate localization: odometry is affected by wheel slip when the robot turns or operates on slippery ground, the IMU drifts due to vibrations, and RTK GPS does not return an accurate fix in (semi-)occluded areas. None of these sensors alone is accurate enough for sound localization of the robot in an outdoor environment. To solve this challenge, sensor fusion was implemented on the robot to prevent possible localization errors; it works by selecting the most accurate readings at a given moment to produce a precise pose estimate. To evaluate the approach, two tests were performed: one with robot localization from the Robot Operating System (ROS) repository and the other with the presented Field Robot Localization. The first did not perform well, while the second did; it was evaluated by comparing its location and orientation estimates with ground truth captured by a drone hovering above the testing ground, which revealed an average error of 0.005 m ± 0.220 m when estimating position and 0.6° ± 3.5° when estimating orientation. The tests proved that the developed field robot localization is accurate and robust enough to be used on the Rovitis 4.0 vineyard robot.
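The fusion step described above selects, at each moment, the sensor reading judged most accurate and uses it for the pose estimate. A minimal sketch of that selection idea is below; the function name, data layout, and variance values are illustrative assumptions, not the paper's actual filter.

```python
import math

def fuse_pose(readings):
    """Pick the reading with the smallest reported variance at each step.

    `readings` is a list of (pose, variance) tuples, one per sensor
    (e.g. odometry, IMU-integrated pose, RTK GPS fix). This is a toy
    stand-in for the paper's fusion logic, not its implementation.
    """
    pose, _ = min(readings, key=lambda r: r[1])
    return pose

# Hypothetical instant where the RTK GPS fix is degraded (e.g. a
# semi-occluded vine row), so the odometry estimate is selected.
odometry = ((1.02, 0.48, math.radians(10)), 0.04)   # (x, y, heading), variance
imu_pose = ((1.05, 0.51, math.radians(12)), 0.09)
rtk_gps  = ((3.90, 0.10, math.radians(80)), 2.50)   # poor fix, large variance

best = fuse_pose([odometry, imu_pose, rtk_gps])
print(best)  # → (1.02, 0.48, 0.17453292519943295)
```

A full system would typically weight sources by covariance (e.g. a Kalman-style update) rather than hard-selecting one, but the sketch shows the core decision: when one sensor's reported uncertainty spikes, the others carry the pose estimate.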
ISSN: 1934-6344, 1934-6352
DOI: 10.25165/j.ijabe.20221506.6415