
Visual Odometry Using Pixel Processor Arrays for Unmanned Aerial Systems in GPS Denied Environments.

Authors :
McConville A
Bose L
Clarke R
Mayol-Cuevas W
Chen J
Greatwood C
Carey S
Dudek P
Richardson T
Source :
Frontiers in robotics and AI [Front Robot AI] 2020 Sep 29; Vol. 7, pp. 126. Date of Electronic Publication: 2020 Sep 29 (Print Publication: 2020).
Publication Year :
2020

Abstract

Environments in which Global Positioning System (GPS), or more generally Global Navigation Satellite System (GNSS), signals are denied or degraded pose problems for the guidance, navigation, and control of autonomous systems. This can make operating in hostile GNSS-impaired environments, such as indoors or in urban and natural canyons, impossible or extremely difficult. Pixel Processor Array (PPA) cameras, in conjunction with other on-board sensors, can be used to address this problem, aiding tracking, localization, and control. In this paper we demonstrate the use of a PPA device, the SCAMP vision chip, which combines perception and compute capabilities on the same device, to aid real-time navigation and control of aerial robots. A PPA consists of an array of Processing Elements (PEs), each of which features light capture, processing, and storage capabilities. This allows various image processing tasks to be performed efficiently directly on the sensor itself. Within this paper we demonstrate visual odometry and target identification running concurrently on board a single PPA vision chip at a combined frequency in the region of 400 Hz. Results from outdoor multirotor test flights are given, along with comparisons against baseline GPS results. The SCAMP PPA's High Dynamic Range (HDR) and its ability to run multiple algorithms at adaptive rates make the sensor well suited to outdoor flight of small UAS in GNSS-challenged or GNSS-denied environments. HDR allows operation to continue during the transition from indoor to outdoor environments, and in other situations where there are significant variations in light levels. Additionally, the PPA only needs to output specific information, such as the optic flow and target position, rather than entire images. This significantly reduces the bandwidth required for communication between the sensor and the on-board flight computer, enabling high-frame-rate, low-power operation.

(Copyright © 2020 McConville, Bose, Clarke, Mayol-Cuevas, Chen, Greatwood, Carey, Dudek and Richardson.)
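For readers unfamiliar with how sparse optic-flow output can serve as an odometry source, the sketch below illustrates the general idea only: accumulating per-frame image-plane flow from a downward-facing sensor into a planar position estimate, scaled by altitude via the pin-hole model. It is not the authors' implementation; the function name, parameters, and the assumption of a known altitude and focal length are illustrative.

```python
# Minimal sketch (assumption, not the paper's method): dead-reckoning a planar
# position from per-frame optic-flow displacements reported by a
# downward-facing sensor, given altitude and camera focal length.
import numpy as np

def integrate_optic_flow(flow_px, altitude_m, focal_length_px,
                         yaw_rad, position_xy):
    """Accumulate one optic-flow sample into an (x, y) position estimate.

    flow_px         : (du, dv) mean image-plane displacement since the
                      previous frame, in pixels
    altitude_m      : height above ground, e.g. from a rangefinder, in metres
    focal_length_px : camera focal length, in pixels
    yaw_rad         : current heading, used to rotate the body-frame
                      displacement into the world frame
    position_xy     : running (x, y) estimate, in metres
    """
    # Pin-hole scaling: ground displacement = image displacement * Z / f
    dx_body = flow_px[0] * altitude_m / focal_length_px
    dy_body = flow_px[1] * altitude_m / focal_length_px

    # Rotate the body-frame displacement into the world frame
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    dx_world = c * dx_body - s * dy_body
    dy_world = s * dx_body + c * dy_body

    return position_xy + np.array([dx_world, dy_world])
```

Because only the aggregated flow (two numbers per frame) and a target position need to cross the sensor interface, this kind of update can run at the high frame rates and low bandwidth described in the abstract.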

Details

Language :
English
ISSN :
2296-9144
Volume :
7
Database :
MEDLINE
Journal :
Frontiers in robotics and AI
Publication Type :
Academic Journal
Accession number :
33501292
Full Text :
https://doi.org/10.3389/frobt.2020.00126