ATLASv2: LLM-Guided Adaptive Landmark Acquisition and Navigation on the Edge
DOI:
https://doi.org/10.1609/aaaiss.v5i1.35588

Abstract
Autonomous systems deployed on edge devices face significant challenges, including resource constraints, real-time processing demands, and the need to adapt to dynamic environments. This work introduces ATLASv2, a novel system that integrates a fine-tuned TinyLLM, real-time object detection, and efficient path planning to enable hierarchical, multi-task navigation and manipulation entirely on an edge device, the Jetson Nano. ATLASv2 dynamically expands its set of navigable landmarks by detecting and localizing objects in the environment, which are saved to its internal knowledge base for future task execution. We evaluate ATLASv2 in real-world environments, including handcrafted home and office settings constructed with diverse objects and landmarks. Results show that ATLASv2 effectively interprets natural language instructions, decomposes them into low-level actions, and executes tasks with high success rates. By leveraging generative AI in a fully on-board framework, ATLASv2 achieves optimized resource utilization with minimal prompting latency and power consumption, bridging the gap between simulated environments and real-world applications.
Published
2025-05-28
How to Cite
Walczak, M., Kallakuri, U., & Mohsenin, T. (2025). ATLASv2: LLM-Guided Adaptive Landmark Acquisition and Navigation on the Edge. Proceedings of the AAAI Symposium Series, 5(1), 196–203. https://doi.org/10.1609/aaaiss.v5i1.35588
Section
GenAI@Edge: Empowering Generative AI at the Edge