
Egomotion from event-based SNN optical flow

Authors :
Universitat Politècnica de Catalunya. Doctorat en Automàtica, Robòtica i Visió
Universitat Politècnica de Catalunya. RAIG - Mobile Robotics and Artificial Intelligence Group
Tian, Yi
Andrade-Cetto, Juan
Publication Year :
2023

Abstract

We present a method for computing egomotion using event cameras with a pre-trained optical flow spiking neural network (SNN). To address the aperture problem encountered in the sparse and noisy normal flow of the initial SNN layers, our method includes a sliding-window bin-based pooling layer that computes a fused full-flow estimate. To add robustness to noisy flow estimates, our method optimizes the intersection of constraints instead of computing egomotion from vector averages. The method also includes a RANSAC step to robustly handle outlier flow estimates in the pooling layer. We validate our approach on both simulated and real scenes, and our results compare favorably with state-of-the-art methods. However, our method may be sensitive to datasets and motion speeds different from those used for training, limiting its generalizability.

This work received support from projects EBCON (PID2020-119244GBI00) and AUDEL (TED2021-131759A-I00) funded by MCIN/AEI/10.13039/501100011033 and by the "European Union NextGenerationEU/PRTR"; the Consolidated Research Group RAIG (2021 SGR 00510) of the Departament de Recerca i Universitats de la Generalitat de Catalunya; and by an FI AGAUR PhD grant to Yi Tian.

Peer Reviewed

Postprint (author's final draft)
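The fusion step described in the abstract — combining noisy normal-flow constraints into a single full-flow estimate per pooling window via an intersection of constraints, with RANSAC rejecting outliers — can be sketched roughly as follows. This is an illustrative NumPy sketch under assumed conventions (each constraint is n·u = m for a unit normal n and normal-flow magnitude m); the function name, thresholds, and structure are hypothetical, not the authors' implementation.

```python
import numpy as np

def fuse_full_flow(normals, mags, n_iters=100, thresh=0.1, rng=None):
    """Fuse normal-flow constraints n . u = m within one pooling window
    into a full-flow estimate u, using RANSAC plus a least-squares refit.
    normals: (N, 2) unit normal directions; mags: (N,) normal-flow magnitudes.
    Illustrative sketch only, not the paper's implementation."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iters):
        # Minimal sample: two constraints define a candidate full flow.
        i, j = rng.choice(len(mags), size=2, replace=False)
        A = np.stack([normals[i], normals[j]])
        if abs(np.linalg.det(A)) < 1e-6:   # near-parallel normals: skip
            continue
        u = np.linalg.solve(A, np.array([mags[i], mags[j]]))
        resid = np.abs(normals @ u - mags)  # constraint residuals
        inliers = resid < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the consensus set: least-squares intersection of constraints.
    A, b = normals[best_inliers], mags[best_inliers]
    u, *_ = np.linalg.lstsq(A, b, rcond=None)
    return u
```

In this sketch the RANSAC loop guards the least-squares fit against outlier normal-flow vectors, which would otherwise bias a plain average-based estimate.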

Details

Database :
OAIster
Notes :
8 p., application/pdf, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1409476513
Document Type :
Electronic Resource