An Attitude Estimation Method Based on Monocular Vision and Inertial Sensor Fusion for Indoor Navigation
- Source :
- IEEE Sensors Journal. 21:27051-27061
- Publication Year :
- 2021
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2021.
Abstract
- The attitude of moving objects has predictive value for navigation and positioning in dim indoor environments. Currently, the accuracy of attitude estimation from vision and inertial sensors is low, as it is mainly affected by factors such as motion-induced image blur, attitude-angle processing algorithms, and data synchronization. In this paper, we first propose a novel continuous multi-frame evaluation and registration algorithm to obtain high signal-to-noise ratio (SNR) images in typical indoor environments without a global positioning system (GPS). The coordinates of vanishing points and plumb lines are extracted from the strongly textured image, and a visual attitude model is constructed from these characteristic points and lines. A pre-integrated gyro sensor is used to establish the inertial attitude model. The visual and gyro attitude models are then fused by multi-rate filtering to obtain high-precision attitude information. Finally, we built a hardware attachment composed of a consumer camera, an inexpensive inertial sensor, and a hardware circuit based on digital signal processing; the experimental results compare favourably with more traditional methods.
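- The abstract's "pre-integrated gyro sensor" refers to accumulating angular-rate samples between camera frames into a single relative rotation. The sketch below is a minimal, illustrative Python/NumPy version of that idea only; it is not the authors' implementation, and the function names, bias handling, and synthetic data are assumptions made for the example.

```python
# Minimal sketch of gyro pre-integration for attitude (illustrative only,
# not the paper's implementation). Angular-rate samples between two camera
# frames are accumulated into one relative rotation matrix on SO(3).
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(phi):
    """Rotation matrix from a rotation vector (Rodrigues' formula)."""
    angle = np.linalg.norm(phi)
    if angle < 1e-12:
        return np.eye(3) + skew(phi)          # first-order approximation
    axis = phi / angle
    K = skew(axis)
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def preintegrate_gyro(omegas, dts, bias=np.zeros(3)):
    """Accumulate bias-corrected angular rates into one relative rotation."""
    R = np.eye(3)
    for w, dt in zip(omegas, dts):
        R = R @ exp_so3((w - bias) * dt)
    return R

if __name__ == "__main__":
    # Synthetic example: constant yaw rate of 0.1 rad/s over 1 s at 100 Hz.
    omegas = [np.array([0.0, 0.0, 0.1])] * 100
    dts = [0.01] * 100
    R = preintegrate_gyro(omegas, dts)
    yaw = np.arctan2(R[1, 0], R[0, 0])
    print(f"integrated yaw ~ {yaw:.4f} rad")  # approximately 0.1 rad
```
- In a fusion scheme of the kind the abstract describes, such a pre-integrated rotation would supply the high-rate inertial attitude prediction that the lower-rate visual attitude (from vanishing points and plumb lines) corrects via filtering.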
- Subjects :
- Inertial frame of reference
Computer science
Computing methodologies: Image processing and computer vision
Sensor fusion
Inertial measurement unit
Global Positioning System
Data synchronization
Computer vision
Artificial intelligence
Electrical and Electronic Engineering
Vanishing point
Instrumentation
Monocular vision
Digital signal processing
Details
- ISSN :
- 2379-9153 and 1530-437X
- Volume :
- 21
- Database :
- OpenAIRE
- Journal :
- IEEE Sensors Journal
- Accession number :
- edsair.doi...........281c1f5be84fff9909b4fe960728bf31
- Full Text :
- https://doi.org/10.1109/jsen.2021.3119289