Method and System for Generating Virtual Boundary of Working Area of Autonomous Mobile Robot, and Autonomous Mobile Robot and Readable Storage Medium

Bibliographic Details
Main Authors: GAO, Xiangyang; CHEN, Hong; ZHU, Shaoming
Format: Patent
Language: English
Published: 28.03.2024

Summary: Disclosed are a method and system for generating a virtual boundary of a working region of a self-moving robot, a self-moving robot, and a readable storage medium. The method comprises the following steps: acquiring several recording points as a mobile positioning module circles along a patrol path; storing the recording points corresponding to the first circuit walked in a first storage linked list and storing the remaining recording points in a second storage linked list; successively retrieving each recording point in the first storage linked list as a basic coordinate point and querying the second storage linked list to select m recording point groups corresponding to each basic coordinate point; acquiring, from each basic coordinate point and its m corresponding recording point groups, a boundary fitting point corresponding to that basic coordinate point, thereby forming a boundary fitting point sequence; acquiring boundary points from the boundary fitting point sequence; and successively connecting the boundary points to generate the virtual boundary of the working region. Because the virtual boundary is generated from the recording points corresponding to the patrol path, human labor costs are reduced and working efficiency is improved.
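
The summary specifies the data flow (two storage linked lists, per-point grouping, fitting, and connection) but not the concrete grouping or fitting rules. The Python sketch below is a rough illustration of that flow only; the function generate_virtual_boundary, the parameters m and group_size, the nearest-neighbour grouping, and the centroid-style fitting step are assumptions made for this sketch and are not taken from the patent.

```python
from collections import deque
import math


def generate_virtual_boundary(recording_points, first_lap_count, m=3, group_size=5):
    """Illustrative sketch of the two-list boundary-fitting flow in the abstract.

    recording_points: list of (x, y) positions logged while the mobile
    positioning module circles the patrol path.
    first_lap_count: number of points belonging to the first circuit.
    m, group_size: illustrative parameters; the patent does not state how the
    m recording point groups are formed, so nearest-neighbour grouping is
    assumed here.
    """
    # Step 1: split the log into the first and second storage linked lists.
    first_list = deque(recording_points[:first_lap_count])
    second_list = deque(recording_points[first_lap_count:])

    boundary_fitting_points = []
    for base in first_list:  # each basic coordinate point
        # Step 2: query the second list for the m * group_size closest points
        # and partition them into m recording point groups (assumption).
        nearest = sorted(second_list, key=lambda p: math.dist(p, base))[: m * group_size]
        groups = [nearest[i::m] for i in range(m)]

        # Step 3: derive one boundary fitting point per basic coordinate point,
        # here by averaging the base point with each group's centroid (assumption).
        xs = [base[0]] + [sum(p[0] for p in g) / len(g) for g in groups if g]
        ys = [base[1]] + [sum(p[1] for p in g) / len(g) for g in groups if g]
        boundary_fitting_points.append((sum(xs) / len(xs), sum(ys) / len(ys)))

    # Step 4: treat the fitting point sequence as the boundary points and
    # connect consecutive points (closing the loop) to form the virtual boundary.
    boundary_points = boundary_fitting_points
    return list(zip(boundary_points, boundary_points[1:] + boundary_points[:1]))
```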
Bibliography: Application Number: US202017768577