
Vision-based navigation of omnidirectional mobile robots

Authors :
Antonio Paolillo
Marilena Vendittelli
Marco Ferro
Andrea Cherubini
Dipartimento di Ingegneria informatica automatica e gestionale (DIAG)
Sapienza University of Rome (Università degli Studi di Roma 'La Sapienza')
Interactive Digital Humans (IDH)
Laboratoire d'Informatique de Robotique et de Microélectronique de Montpellier (LIRMM)
Centre National de la Recherche Scientifique (CNRS) - Université de Montpellier (UM)
Source :
IEEE Robotics and Automation Letters, 2019, 4 (3), pp. 2691-2698. ⟨10.1109/LRA.2019.2913077⟩
Publication Year :
2019
Publisher :
HAL CCSD, 2019.

Abstract

This paper considers the problem of collision-free navigation of omnidirectional mobile robots in environments with obstacles. Information from a monocular camera, encoders, and an inertial measurement unit is used to achieve the task. Three different visual servoing control schemes, compatible with the considered class of robot kinematics and sensor equipment, are analysed, and their robustness properties with respect to actuation inaccuracies are discussed. A controller is then proposed with a formal guarantee of convergence to the bisector of a corridor. Its main components are a visual servoing control scheme and a velocity estimation algorithm integrating visual, kinematic, and inertial information. The behaviour of all the considered algorithms is analysed and illustrated through simulations for both a wheeled and a humanoid robot. The solution proposed as the most efficient and robust with respect to actuation inaccuracies is also validated experimentally on a real NAO humanoid.
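
To make the abstract's two main ingredients concrete, the following is a minimal illustrative sketch, not the controller derived in the paper: a generic proportional visual-servoing step that steers an omnidirectional base toward a corridor bisector from two assumed image features (the vanishing-point abscissa and a midline offset), plus a complementary-filter-style velocity fusion of encoder, visual, and IMU data. All feature definitions, gains, and the fusion weights here are assumptions chosen for illustration.

```python
import numpy as np

def corridor_servo_step(x_vp, x_mid, v_forward=0.1, k_vp=0.8, k_mid=0.5):
    """One step of a hypothetical corridor-centering visual servo.

    x_vp  : normalized image abscissa of the corridor vanishing point
            (0 when the robot heading is aligned with the corridor axis).
    x_mid : normalized image offset of the corridor midline
            (0 when the robot lies on the corridor bisector).
    Returns a body-frame omnidirectional command (vx, vy, wz).
    """
    vx = v_forward          # constant forward progression along the corridor
    vy = -k_mid * x_mid     # lateral correction toward the bisector
    wz = -k_vp * x_vp       # heading correction toward the corridor axis
    return np.array([vx, vy, wz])

def fuse_velocity(v_prev, a_imu, v_enc, v_vis, dt, alpha=0.9):
    """Illustrative complementary-filter blend of the three velocity sources:
    high-rate IMU integration corrected by low-rate encoder/visual estimates."""
    v_pred = v_prev + a_imu * dt        # prediction from inertial data
    v_meas = 0.5 * (v_enc + v_vis)      # measurement from kinematics + vision
    return alpha * v_pred + (1.0 - alpha) * v_meas
```

For example, with the robot offset to the right of the bisector (x_mid > 0) and slightly misaligned (x_vp > 0), the sketch commands a leftward lateral velocity and a corrective yaw rate, which is the qualitative behaviour described in the abstract; the actual control law, feature extraction, and convergence analysis are given in the paper itself.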

Details

Language :
English
ISSN :
2377-3766
Database :
OpenAIRE
Journal :
IEEE Robotics and Automation Letters, 2019, 4 (3), pp. 2691-2698. ⟨10.1109/LRA.2019.2913077⟩
Accession number :
edsair.doi.dedup.....db7d7f84dd9d3d0d1fb074c88c532304
Full Text :
https://doi.org/10.1109/LRA.2019.2913077