
A Camera to LiDAR calibration approach through the optimization of atomic transformations.

Authors :
Pinto de Aguiar, André Silva
Riem de Oliveira, Miguel Armando
Pedrosa, Eurico Farinha
Neves dos Santos, Filipe Baptista
Source :
Expert Systems with Applications, Aug 2021, Vol. 176.
Publication Year :
2021

Abstract

• Camera to 3D laser calibration through the optimization of atomic transformations.
• Novel consideration of the calibration of multiple sensors simultaneously.
• Support for multiple sensor modalities.
• Interactive, visual and user-oriented Robot Operating System implementation.

This paper proposes a camera to 3D Light Detection and Ranging (LiDAR) calibration framework based on the optimization of atomic transformations. The system is able to simultaneously calibrate multiple cameras with LiDAR sensors, solving the bundle calibration problem, i.e., the joint calibration of the complete set of sensors. In comparison with the state of the art, this work presents several novelties: the ability to simultaneously calibrate multiple cameras and LiDARs; the support for multiple sensor modalities; the calibration through the optimization of atomic transformations, without changing the topology of the input transformation tree; and the integration of the calibration framework within the Robot Operating System (ROS). The software pipeline allows the user to interactively position the sensors to provide an initial estimate, to label and collect data, and to visualize the calibration procedure. To test the framework, an agricultural robot equipped with a stereo camera and a 3D LiDAR sensor was used. Pairwise calibrations and a single calibration of all three sensors were tested and evaluated. Results show that the proposed approach produces calibrations as accurate as the state of the art and is robust to harsh conditions such as inaccurate initial guesses or small amounts of calibration data. Experiments show that the optimization process can handle an angular error of approximately 20 degrees and a translation error of 0.5 meters for each sensor. Moreover, the proposed approach achieves state-of-the-art results even when calibrating the entire system simultaneously. [ABSTRACT FROM AUTHOR]
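The abstract frames calibration as a non-linear optimization over the rigid ("atomic") transformations that relate the sensors. The sketch below is not the authors' implementation; it is a minimal illustration of that idea for a single camera-LiDAR pair, assuming a pinhole camera model, synthetic correspondences between LiDAR points on a calibration target and their image detections, and SciPy's least-squares solver. All names in it (project, residuals, the intrinsic matrix K, the synthetic data) are hypothetical.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def project(points_cam, K):
    # Pinhole projection of 3D points (N, 3) in the camera frame to pixels (N, 2).
    uvw = (K @ points_cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def residuals(x, lidar_pts, image_pts, K):
    # Reprojection residuals for a 6-DoF extrinsic x = [rotation vector (3), translation (3)].
    rot, t = R.from_rotvec(x[:3]), x[3:]
    pts_cam = rot.apply(lidar_pts) + t  # LiDAR frame -> camera frame
    return (project(pts_cam, K) - image_pts).ravel()

if __name__ == "__main__":
    # Synthetic example: a known ground-truth extrinsic and random target points.
    K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
    gt = np.concatenate([R.from_euler("xyz", [5, -3, 10], degrees=True).as_rotvec(),
                         [0.10, -0.05, 0.20]])
    lidar_pts = np.random.default_rng(0).uniform([-0.5, -0.5, 2.0], [0.5, 0.5, 3.0], (50, 3))
    image_pts = project(R.from_rotvec(gt[:3]).apply(lidar_pts) + gt[3:], K)

    # Deliberately rough initial guess (identity), echoing the robustness tests in the abstract.
    solution = least_squares(residuals, np.zeros(6), args=(lidar_pts, image_pts, K))
    print("estimated extrinsic:", np.round(solution.x, 4))

Running the script recovers the ground-truth extrinsic from the identity initialization. The approach described in the abstract generalizes this single-pair optimization to the robot's full transformation tree, optimizing only the atomic transformations while keeping the tree's topology fixed, and wraps the initial-estimate positioning, data labelling and visualization steps in ROS tooling.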

Details

Language :
English
ISSN :
0957-4174
Volume :
176
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
150127359
Full Text :
https://doi.org/10.1016/j.eswa.2021.114894