OLM-SLAM: online lifelong memory system for simultaneous localization and mapping
| Published in | Measurement science & technology, Vol. 36, no. 1, p. 16328 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | 31.01.2025 |
| Summary | Simultaneous Localization and Mapping (SLAM) is a fundamental task for robots in unknown environments. However, the poor generalization of learning-based algorithms in unseen environments hinders their widespread adoption, and artificial neural networks suffer from catastrophic forgetting. We propose a lifelong SLAM framework, OLM-SLAM, that effectively mitigates catastrophic forgetting. To preserve the network's generalization ability, we propose a sensitivity analysis of the network weight parameters. Inspired by human memory storage mechanisms, we design a dual memory storage mechanism that retains both dynamic memory and static memory, together with a novel memory filtering mechanism that maximizes image diversity within a fixed-size memory store, addressing the limited storage capacity of embedded devices in real-world deployments. We extensively evaluated the model on a variety of real-world datasets. Compared with CL-SLAM, OLM-SLAM improves the overall translation error on the test sequences by 44.9%, and improves the translation and rotation errors of Retention Ability (RA) by 111.6% and 66.7%, respectively. The results demonstrate that OLM-SLAM outperforms previous methods of the same type and retains high RA across different sequences from the same type of environment. |
| ISSN | 0957-0233, 1361-6501 |
| DOI | 10.1088/1361-6501/ad9347 |
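
The abstract names a sensitivity analysis of the network weight parameters but this record gives no formula. The following is a minimal sketch, assuming a Memory Aware Synapses-style importance proxy (accumulated gradient magnitude of the squared output norm); the function name `weight_sensitivity`, the toy network, and the use of PyTorch are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of weight-parameter sensitivity analysis.
# The abstract gives no formula, so this uses a common importance
# proxy (mean gradient magnitude of the squared output norm, as in
# Memory Aware Synapses); treat every name here as an assumption.
import torch
import torch.nn as nn


def weight_sensitivity(model: nn.Module, loader) -> dict:
    """Estimate per-parameter sensitivity as mean |d ||f(x)||^2 / dw|."""
    sens = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    n_batches = 0
    for x in loader:
        model.zero_grad()
        out = model(x)
        # Gradient of the squared L2 output norm w.r.t. each weight
        # measures how strongly that weight influences the function.
        out.pow(2).sum().backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                sens[n] += p.grad.abs()
        n_batches += 1
    return {n: s / max(n_batches, 1) for n, s in sens.items()}


if __name__ == "__main__":
    net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
    data = [torch.randn(32, 8) for _ in range(5)]
    importance = weight_sensitivity(net, data)
    # Low-sensitivity weights are natural candidates to adapt to a new
    # environment; high-sensitivity weights would be protected.
    print({n: v.mean().item() for n, v in importance.items()})
```

Under this reading, protecting high-sensitivity weights while letting low-sensitivity weights adapt is what would let the network keep its generalization ability across environments.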
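Likewise, the dual memory storage and memory filtering mechanisms are only named in the abstract. Below is a minimal sketch, assuming a FIFO dynamic memory, a fixed-size static memory, and a greedy max-min heuristic that evicts the most redundant stored embedding; the class and method names (`DualMemory`, `add`, `sample`) are hypothetical, not the paper's API.

```python
# Hypothetical sketch of a dual-memory replay buffer with a
# diversity-maximizing filter, loosely following the mechanisms the
# abstract names (dynamic memory, static memory, memory filtering).
# The greedy max-min heuristic is an assumption of this sketch.
from collections import deque
import numpy as np


class DualMemory:
    def __init__(self, static_capacity: int, dynamic_capacity: int):
        # Dynamic memory: recent samples, FIFO (short-term).
        self.dynamic = deque(maxlen=dynamic_capacity)
        # Static memory: fixed-size long-term store, kept diverse.
        self.static_capacity = static_capacity
        self.static = []

    def add(self, embedding: np.ndarray) -> None:
        """Insert one image embedding into both memories."""
        self.dynamic.append(embedding)
        if len(self.static) < self.static_capacity:
            self.static.append(embedding)
            return
        # Memory filtering: find the most redundant stored sample (the
        # one closest to its nearest neighbour) and replace it only if
        # the newcomer would be less redundant, i.e. more diverse.
        emb = np.stack(self.static)                      # (N, d)
        dists = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
        np.fill_diagonal(dists, np.inf)
        nn_dist = dists.min(axis=1)                      # per-sample NN distance
        victim = int(np.argmin(nn_dist))                 # most redundant sample
        new_nn = np.linalg.norm(emb - embedding, axis=1).min()
        if new_nn > nn_dist[victim]:
            self.static[victim] = embedding

    def sample(self, k: int, rng: np.random.Generator) -> list:
        """Draw a replay batch mixing recent and long-term memories."""
        pool = list(self.dynamic) + self.static
        idx = rng.choice(len(pool), size=min(k, len(pool)), replace=False)
        return [pool[i] for i in idx]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = DualMemory(static_capacity=32, dynamic_capacity=8)
    for _ in range(200):
        mem.add(rng.normal(size=128))   # stand-in for an image embedding
    print(len(mem.static), len(mem.dynamic), len(mem.sample(16, rng)))
```

The point of the max-min replacement rule is that the static store never grows past its fixed capacity yet keeps spreading out in embedding space, which matches the abstract's goal of maximizing image diversity under the limited storage of embedded devices.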