Visual Navigation of Caged Chicken Coop Inspection Robot Based on Road Features

Bibliographic Details
Published in: Animals (Basel), Vol. 14, No. 17, p. 2515
Main Authors: Deng, Hongfeng; Zhang, Tiemin; Li, Kan; Yang, Jikang
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 29.08.2024

Summary: The speed and accuracy of navigation road extraction, together with driving stability, determine the inspection accuracy of caged chicken coop inspection robots. In this paper, a new grayscale factor (4B-3R-2G) was proposed to achieve fast and accurate road extraction, and a navigation line fitting algorithm based on road boundary features was proposed to improve the stability of the algorithm. The proposed grayscale factor achieved 92.918% segmentation accuracy at six times the speed of the deep learning model. The experimental results showed that at a travel speed of 0.348 m/s, the maximum deviation of the visual navigation was 4 cm, the average deviation was 1.561 cm, the maximum acceleration was 1.122 m/s², and the average acceleration was 0.292 m/s², while the detection count and detection accuracy increased by 21.125% and 1.228%, respectively. Compared with inertial navigation, visual navigation significantly improves the navigation accuracy and stability of the inspection robot and yields better inspection results. The visual navigation system proposed in this paper offers better driving stability, higher inspection efficiency, better inspection results, and lower operating costs, which is of great significance for promoting the automation of large-scale caged chicken breeding and realizing rapid, accurate monitoring.
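The record specifies only the channel weights of the proposed grayscale factor (4B-3R-2G). The sketch below shows how such a factor might be applied for road segmentation; it assumes OpenCV's BGR channel order and uses Otsu thresholding for binarization, neither of which is confirmed by the abstract, and the function names and input filename are hypothetical.

```python
# Minimal sketch of applying a 4B-3R-2G grayscale factor, followed by an
# assumed Otsu thresholding step. Not the authors' actual pipeline.
import cv2
import numpy as np

def road_grayscale(img_bgr: np.ndarray) -> np.ndarray:
    """Compute gray = 4*B - 3*R - 2*G, clipped to the uint8 range."""
    # Cast to signed integers so the weighted sum can go negative
    # without wrapping around in uint8 arithmetic.
    b = img_bgr[..., 0].astype(np.int32)
    g = img_bgr[..., 1].astype(np.int32)
    r = img_bgr[..., 2].astype(np.int32)
    gray = 4 * b - 3 * r - 2 * g
    return np.clip(gray, 0, 255).astype(np.uint8)

def segment_road(img_bgr: np.ndarray) -> np.ndarray:
    """Binarize the factored grayscale image (Otsu's method assumed)."""
    gray = road_grayscale(img_bgr)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

if __name__ == "__main__":
    frame = cv2.imread("coop_aisle.jpg")  # hypothetical input frame
    cv2.imwrite("road_mask.png", segment_road(frame))
```

Computing the weighted sum in a signed integer type before clipping is what keeps the transform well defined; a direct uint8 computation would silently wrap on negative values.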
ISSN: 2076-2615
DOI: 10.3390/ani14172515