Mobile Robot Self-Localization Using Omnidirectional Vision with Feature Matching from Real and Virtual Spaces.
- Source :
- Applied Sciences (2076-3417); Apr 2021, Vol. 11 Issue 8, p3360, 14p
- Publication Year :
- 2021
Abstract
- This paper presents a novel self-localization technique for mobile robots based on image feature matching from omnidirectional vision. The proposed method first constructs a virtual space with synthetic omnidirectional imaging to simulate a mobile robot equipped with an omnidirectional vision system in the real world. In the virtual space, a number of vertical and horizontal lines are generated according to the structure of the environment. They are imaged by the virtual omnidirectional camera using the catadioptric projection model. The omnidirectional images derived from the virtual and real environments are then used to match the synthetic lines and real scene edges. Finally, the pose and trajectory of the mobile robot in the real world are estimated by the efficient perspective-n-point (EPnP) algorithm based on the line feature matching. In our experiments, the effectiveness of the proposed self-localization technique was validated by the navigation of a mobile robot in a real-world environment. [ABSTRACT FROM AUTHOR]
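- Illustration of the pose-estimation step: the sketch below is a minimal, hypothetical example (not the authors' implementation) of recovering a camera pose from matched 3D/2D point correspondences with OpenCV's EPnP solver, as in the final step of the abstract. The point coordinates, the pinhole intrinsics K, and the variable names are placeholder assumptions; a real catadioptric image would first be rectified to an equivalent perspective model before such a solver applies.

```python
# Minimal sketch (assumption, not the paper's code): estimate a camera pose with
# OpenCV's EPnP solver from 3D points sampled on structural lines of an
# environment model and their matched image projections.
import numpy as np
import cv2

# Hypothetical 3D points (metres) sampled along vertical/horizontal scene lines.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [0.0, 0.0, 2.5],   # endpoints of a vertical line segment
    [4.0, 0.0, 0.0],
    [4.0, 0.0, 2.5],
    [0.0, 3.0, 0.0],
    [4.0, 3.0, 0.0],   # endpoints of a horizontal line segment
], dtype=np.float64)

# Hypothetical matched 2D projections (pixels) in the rectified image.
image_points = np.array([
    [320.0, 400.0],
    [322.0, 150.0],
    [780.0, 410.0],
    [778.0, 160.0],
    [250.0, 480.0],
    [820.0, 470.0],
], dtype=np.float64)

# Assumed pinhole intrinsics and zero distortion for the rectified view.
K = np.array([[600.0, 0.0, 512.0],
              [0.0, 600.0, 384.0],
              [0.0, 0.0, 1.0]], dtype=np.float64)
dist_coeffs = np.zeros(5)

# EPnP returns the rotation (Rodrigues vector) and translation that map
# world coordinates into the camera frame.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs,
                              flags=cv2.SOLVEPNP_EPNP)
R, _ = cv2.Rodrigues(rvec)

# Camera (robot) position in world coordinates: C = -R^T t.
camera_position = -R.T @ tvec
print("estimated robot position:", camera_position.ravel())
```

- Running the solver over successive frames yields a sequence of such poses, which is how a trajectory like the one described in the abstract would be accumulated; the placeholder correspondences above are for demonstration only and carry no calibrated meaning.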
- Subjects :
- MOBILE robots
- ROBOT vision
- IMAGE registration
- VIRTUAL reality
- VISION
Details
- Language :
- English
- ISSN :
- 2076-3417
- Volume :
- 11
- Issue :
- 8
- Database :
- Complementary Index
- Journal :
- Applied Sciences (2076-3417)
- Publication Type :
- Academic Journal
- Accession number :
- 150434477
- Full Text :
- https://doi.org/10.3390/app11083360