
LP-SLAM: Language-Perceptive RGB-D SLAM system based on Large Language Model

Authors:
Zhang, Weiyi
Guo, Yushi
Niu, Liting
Li, Peijun
Zhang, Chun
Wan, Zeyu
Yan, Jiaxiang
Farrukh, Fasih Ud Din
Zhang, Debing
Publication Year:
2023

Abstract

Simultaneous localization and mapping (SLAM) is a critical technology that enables autonomous robots to perceive their surrounding environment. With the development of deep learning, SLAM systems can achieve a higher level of perception of the environment, including at the semantic and text levels. However, current works are limited in their ability to achieve natural-language-level perception of the world. To address this limitation, we propose LP-SLAM, the first language-perceptive SLAM system that leverages large language models (LLMs). LP-SLAM has two major features: (a) it can detect text in the scene and determine whether it represents a landmark to be stored during the tracking and mapping phase, and (b) it can understand natural-language input from humans and provide guidance based on the generated map. We illustrate three uses of the LLM in the system: text clustering, landmark judgment, and natural-language navigation. Our proposed system represents an advancement in the field of LLM-based SLAM and opens up new possibilities for autonomous robots to interact with their environment in a more natural and intuitive way.

Comment: 12 pages, 16 figures
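The three LLM uses named in the abstract (text clustering, landmark judgment, natural-language navigation) can be sketched as a minimal pipeline. This is an illustrative assumption, not the authors' implementation: the LLM is stubbed with a trivial heuristic, and all function names, poses, and detections are hypothetical.

```python
def llm_is_landmark(text: str) -> bool:
    """Stub for the LLM 'landmark judgment' step: decide whether OCR'd
    scene text (e.g. a room sign) is worth storing as a map landmark.
    Heuristic stand-in: keep alphabetic signage, drop price tags."""
    return any(ch.isalpha() for ch in text) and not text.startswith("$")

def build_landmark_map(detections):
    """Tracking/mapping phase: cluster detections of the same text,
    keeping only those the (stubbed) LLM judges to be landmarks.
    Each landmark maps to the list of camera poses where it was seen."""
    landmarks = {}
    for text, pose in detections:
        if llm_is_landmark(text):
            landmarks.setdefault(text, []).append(pose)
    return landmarks

def navigate(query: str, landmarks):
    """Stub for natural-language navigation: match the query against
    stored landmark text and return an associated pose, if any."""
    for name, poses in landmarks.items():
        if name.lower() in query.lower():
            return poses[0]
    return None

# Hypothetical (text, 2-D pose) detections from the tracking phase.
detections = [("Room 203", (1.0, 2.0)),
              ("$4.99", (3.0, 1.0)),
              ("Exit", (0.0, 5.0))]
lmap = build_landmark_map(detections)
print(navigate("take me to Room 203", lmap))  # (1.0, 2.0)
```

In the actual system the heuristic would be replaced by a prompt to an LLM, and poses would come from the RGB-D tracking thread; the sketch only shows how the three roles fit together around a text-keyed landmark map.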

Subjects

Computer Science - Robotics

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2303.10089
Document Type:
Working Paper