Navigation path extraction for garden mobile robot based on road median point

Bibliographic Details
Published in: EURASIP Journal on Advances in Signal Processing, Vol. 2025, no. 1, pp. 6–21
Main Author: Li, Wei
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 27.02.2025

Summary: Extracting navigation paths is key to the autonomous navigation of garden mobile robots. Bottlenecks such as a lack of reliability under highly dynamic interference and the difficulty of eliminating erroneous detections have limited the large-scale industrial deployment of garden mobile robots, and they bear directly on the accuracy, reliability, and safety of the navigation system. To address these challenges, this paper proposes a navigation path extraction method for garden mobile robots based on road median points. After semantic-level perception of the scene, the 1920 × 360 pixel region at the bottom of the image is taken as the region of interest. An edge detection method is then proposed to locate the road median point, and discrete navigation points are predicted by machine learning. A "local + global" two-stage anti-interference strategy jointly eliminates interference points, preventing severe interference from degrading the accuracy of the fitted path. Combined with the idea of "turning curves into straights", the navigation paths are fitted. The experimental results show that the proposed navigation path extraction method has stronger adaptive ability, higher anti-interference capability, and better accuracy, which makes it more attractive for practical use.
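The pipeline the summary describes (crop a bottom-of-image ROI, take the midpoint between the road edges in each row, then reject interference points in a local and a global stage before fitting) can be sketched as below. This is a minimal illustration in NumPy, not the paper's implementation: it assumes a binary road mask from semantic segmentation, and the function names, window size, and thresholds are illustrative choices, not values from the paper.

```python
import numpy as np

def extract_centerline(road_mask, roi_height=360):
    """Locate the road median point in each row of the ROI.

    road_mask: 2D boolean array (H x W), True where a pixel is road.
    Only the bottom `roi_height` rows are used, mirroring the
    1920 x 360 region of interest described in the summary.
    Returns an (N, 2) array of (row, column) median points.
    """
    roi = road_mask[-roi_height:, :]
    points = []
    for r, row in enumerate(roi):
        cols = np.flatnonzero(row)
        if cols.size:
            # midpoint between the leftmost and rightmost road pixels
            points.append((r, 0.5 * (cols[0] + cols[-1])))
    return np.array(points)

def reject_outliers(points, local_win=5, local_tol=30.0, global_k=2.5):
    """Illustrative 'local + global' two-stage outlier elimination.

    Local stage: drop points far from the median of their sliding
    window of neighbours. Global stage: drop points with large
    residuals against a straight-line fit to the survivors.
    """
    rows, cols = points[:, 0], points[:, 1]
    keep = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        lo, hi = max(0, i - local_win), min(len(points), i + local_win + 1)
        if abs(cols[i] - np.median(cols[lo:hi])) > local_tol:
            keep[i] = False
    rows, cols = rows[keep], cols[keep]
    # global stage: residuals against a first-order polynomial fit
    a, b = np.polyfit(rows, cols, 1)
    resid = cols - (a * rows + b)
    keep = np.abs(resid) <= global_k * resid.std() + 1e-6
    return np.column_stack([rows[keep], cols[keep]])
```

The surviving points would then be fitted with a line or low-order polynomial, in the spirit of the "turning curves into straights" idea, with the straight-line global check above standing in for that final fit.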
ISSN: 1687-6172, 1687-6180
DOI: 10.1186/s13634-025-01209-8