Contextualized Autonomous Drone Navigation Using LLMs Deployed in Edge-Cloud Computing

Bibliographic Details
Published in: 2025 International Conference on Machine Learning and Autonomous Systems (ICMLAS), pp. 1373-1378
Main Authors: Chen, Hongqian; Tang, Yun; Tsourdos, Antonios; Guo, Weisi
Format: Conference Proceeding
Language: English
Published: IEEE, 10.03.2025

Summary: Autonomous navigation is usually trained offline on diverse scenarios and fine-tuned online from real-world experience. The real world, however, is dynamic and changeable, and many environmental encounters and effects are not accounted for in real time because they are difficult to capture in offline training data, or even to describe in online scenarios. A human operator, by contrast, can describe these dynamic environmental encounters in natural language, adding semantic context. This research deploys Large Language Models (LLMs) to perform real-time contextual code adjustment for autonomous navigation. The challenge not yet evaluated in the literature is which LLMs are appropriate and where these computationally heavy algorithms should sit within computation-communication edge-cloud architectures. In this paper, we evaluate how different LLMs can dynamically adjust navigation map parameters (e.g., contour map shaping) and derive navigation task instruction sets. We then evaluate which LLMs are most suitable and where they should sit in future edge-cloud 6G telecommunication architectures.
DOI: 10.1109/ICMLAS64557.2025.10967934
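
The summary above only outlines the approach. As an illustration of the kind of contextual adjustment it describes (an LLM turning an operator's natural-language report into updated navigation map parameters and a task instruction set), a minimal sketch follows. It is not the authors' implementation; the function, class, and JSON field names (query_llm, NavParams, obstacle_inflation_m, etc.) are assumptions chosen for clarity.

```python
# Minimal sketch (assumed design, not the paper's code): an LLM-in-the-loop step
# that converts an operator's natural-language report into (a) adjusted cost-map
# parameters and (b) a short navigation instruction set.
import json
from dataclasses import dataclass


@dataclass
class NavParams:
    obstacle_inflation_m: float = 0.5   # inflation radius applied to the contour/cost map
    max_speed_mps: float = 5.0          # speed cap passed to the local planner
    altitude_m: float = 30.0            # cruise altitude


PROMPT_TEMPLATE = """You assist a drone navigation stack.
Current parameters: {params}
Operator report: "{report}"
Respond with JSON only:
{{"obstacle_inflation_m": <float>, "max_speed_mps": <float>,
  "altitude_m": <float>, "instructions": [<short imperative strings>]}}"""


def query_llm(prompt: str) -> str:
    """Placeholder for an edge- or cloud-hosted LLM call (hypothetical endpoint)."""
    raise NotImplementedError("Wire this to the deployed model endpoint.")


def contextual_adjustment(report: str, params: NavParams) -> tuple[NavParams, list[str]]:
    """Ask the LLM for updated parameters and instructions, then clamp to safe bounds."""
    prompt = PROMPT_TEMPLATE.format(params=vars(params), report=report)
    reply = json.loads(query_llm(prompt))
    # Clamp LLM-proposed values before they reach the planner.
    adjusted = NavParams(
        obstacle_inflation_m=min(max(float(reply["obstacle_inflation_m"]), 0.2), 3.0),
        max_speed_mps=min(max(float(reply["max_speed_mps"]), 0.5), 10.0),
        altitude_m=min(max(float(reply["altitude_m"]), 10.0), 120.0),
    )
    return adjusted, [str(step) for step in reply.get("instructions", [])]
```

In the edge-cloud setting the paper evaluates, the open deployment question is where query_llm executes: a smaller model at the edge may keep latency low for parameter tweaks, while a larger cloud-hosted model may be better suited to deriving richer instruction sets.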