
Robust Onboard Localization in Changing Environments Exploiting Text Spotting

Authors :
Zimmerman, Nicky
Wiesmann, Louis
Guadagnino, Tiziano
Läbe, Thomas
Behley, Jens
Stachniss, Cyrill
Publication Year :
2022

Abstract

Robust localization in a given map is a crucial component of most autonomous robots. In this paper, we address the problem of localizing in an indoor environment that changes over time and where prominent structures have no correspondence in the map built at a different point in time. To overcome the discrepancy between the map and the observed environment caused by such changes, we exploit human-readable localization cues to assist localization. These cues are readily available in most facilities and can be detected in RGB camera images using text spotting. We integrate these cues into a Monte Carlo localization framework using a particle filter that operates on 2D LiDAR scans and camera data. In this way, we provide a robust localization solution for environments with structural changes and dynamics caused by humans walking. We evaluate our localization framework on multiple challenging indoor scenarios in an office environment. The experiments suggest that our approach is robust to structural changes and can run on an onboard computer. We release an open-source implementation of our approach (upon paper acceptance), written in C++ with a ROS wrapper, which uses off-the-shelf text spotting.

Comment: This work has been accepted to IROS 2022. Copyright may be transferred without notice, after which this version may no longer be accessible.
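
The core idea described in the abstract, weighting particles by how well a spotted text cue (e.g., a room-number sign) agrees with its known position in the map, can be illustrated with a minimal C++ sketch. This is not the authors' released implementation: the struct names, the Gaussian range model, and the placeholder LiDAR likelihood below are assumptions made purely for illustration.

    // Minimal sketch (not the released C++/ROS implementation): fusing a
    // text-spotting cue into a Monte Carlo localization weight update.
    // Type names, the Gaussian text-cue model, and the map lookup are
    // illustrative assumptions.
    #include <cmath>
    #include <iostream>
    #include <numeric>
    #include <vector>

    struct Pose2D { double x, y, theta; };

    struct Particle {
      Pose2D pose;
      double weight;
    };

    // Known map position of a spotted text label, e.g. a room sign.
    struct TextLandmark { double x, y; };

    // Placeholder LiDAR likelihood; a real system would compare the 2D scan
    // against the occupancy grid map (e.g. via ray casting or a likelihood field).
    double lidarLikelihood(const Particle& /*p*/) { return 1.0; }

    // Illustrative text-cue likelihood: particles whose predicted distance to the
    // spotted sign matches the measured distance receive a higher weight
    // (Gaussian error model on range).
    double textCueLikelihood(const Particle& p, const TextLandmark& lm,
                             double measuredRange, double sigma) {
      const double dx = lm.x - p.pose.x;
      const double dy = lm.y - p.pose.y;
      const double expectedRange = std::sqrt(dx * dx + dy * dy);
      const double err = measuredRange - expectedRange;
      return std::exp(-0.5 * (err * err) / (sigma * sigma));
    }

    // One measurement update: combine both observation models multiplicatively,
    // then normalize the particle weights.
    void measurementUpdate(std::vector<Particle>& particles,
                           const TextLandmark& lm, double measuredRange) {
      for (auto& p : particles) {
        p.weight *= lidarLikelihood(p) *
                    textCueLikelihood(p, lm, measuredRange, /*sigma=*/0.5);
      }
      const double sum = std::accumulate(
          particles.begin(), particles.end(), 0.0,
          [](double s, const Particle& p) { return s + p.weight; });
      if (sum > 0.0)
        for (auto& p : particles) p.weight /= sum;
    }

    int main() {
      std::vector<Particle> particles = {{{1.0, 2.0, 0.0}, 1.0},
                                         {{5.0, 5.0, 0.0}, 1.0}};
      TextLandmark sign{1.5, 2.0};  // where the sign text is anchored in the map
      measurementUpdate(particles, sign, /*measuredRange=*/0.6);
      for (const auto& p : particles)
        std::cout << "weight: " << p.weight << "\n";
    }

In this toy update, the particle near the sign's mapped position keeps most of the probability mass after normalization, which is the intuition behind using text cues to disambiguate poses when the LiDAR map no longer matches the changed environment.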

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333759147
Document Type :
Electronic Resource