An Overview on Visual SLAM: From Tradition to Semantic.
- Source :
- Remote Sensing, Jul 2022, Vol. 14, Issue 13, Article 3010, 47 pp.
- Publication Year :
- 2022
Abstract
- Visual SLAM (VSLAM) has been developing rapidly due to its advantages of low-cost sensors, easy fusion with other sensors, and richer environmental information. Traditional vision-based SLAM research has made many achievements, but it may fail to achieve the desired results in challenging environments. Deep learning has promoted the development of computer vision, and the combination of deep learning and SLAM has attracted increasing attention. Semantic information, as high-level environmental information, can enable robots to better understand their surroundings. This paper introduces the development of VSLAM technology from two aspects: traditional VSLAM and semantic VSLAM combined with deep learning. For traditional VSLAM, we summarize the advantages and disadvantages of indirect and direct methods in detail and present some classical open-source VSLAM algorithms. In addition, we focus on the development of semantic VSLAM based on deep learning. Starting with the typical neural networks, CNNs and RNNs, we summarize in detail how neural networks improve VSLAM systems. Later, we focus on how object detection and semantic segmentation help introduce semantic information into VSLAM. We believe that the future intelligent era cannot develop without the help of semantic technology. Introducing deep learning into VSLAM systems to provide semantic information can help robots better perceive their surroundings and provide people with higher-level assistance. [ABSTRACT FROM AUTHOR]
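- Reader's note: the indirect/direct split summarized in the abstract is conventionally formalized as a choice of error function. The following is an orienting sketch using standard formulations, not equations taken from the article itself. Indirect (feature-based) methods estimate the camera pose T by minimizing the geometric reprojection error of matched features, whereas direct methods minimize the photometric error over raw pixel intensities:

\[
E_{\mathrm{indirect}}(T) = \sum_{i} \bigl\| u_i - \pi(T X_i) \bigr\|^2,
\qquad
E_{\mathrm{direct}}(T) = \sum_{i} \bigl( I_{\mathrm{ref}}(u_i) - I_{\mathrm{cur}}(\pi(T X_i)) \bigr)^2,
\]

where the X_i are 3D landmark points, the u_i their observed pixel locations, \pi the camera projection function, and I_ref and I_cur the reference and current image intensities.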
- Subjects :
- DEEP learning
- COMPUTER vision
Details
- Language :
- English
- ISSN :
- 2072-4292
- Volume :
- 14
- Issue :
- 13
- Database :
- Complementary Index
- Journal :
- Remote Sensing
- Publication Type :
- Academic Journal
- Accession number :
- 157998442
- Full Text :
- https://doi.org/10.3390/rs14133010