
Predicting Polarization Beyond Semantics for Wearable Robotics

Authors:
Kailun Yang
Eduardo Romera
Luis M. Bergasa
Kaiwei Wang
Xiao Huang
Source:
Humanoids
Publication Year:
2018
Publisher:
IEEE, 2018.

Abstract

Semantic perception is a key enabler in robotics, offering an information-rich and efficient way to exploit visual data for higher-level navigation and manipulation tasks. Because specular surfaces such as water hazards, transparent glass and metallic materials remain challenging for RGB-based pixel-wise semantic segmentation, polarization imaging has been explored as a complement, since it reflects surface characteristics and provides additional attributes. However, polarimetric measurements generally require prohibitively expensive cameras and highly accurate calibration. Inspired by the representation power of Convolutional Neural Networks (CNNs), we propose to predict polarization information, specifically the per-pixel polarization difference, from monocular RGB images. The core of our approach is a family of efficient deep architectures built on factorized convolutions, hierarchical dilations and pyramid representations, designed to produce both semantic and polarimetric estimates in real time. Comprehensive experiments on a wearable exoskeleton humanoid robot demonstrate qualified accuracy.
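To make the architectural ingredients concrete, the sketch below illustrates what a factorized, dilated residual block and a dual-head output (semantic logits plus a per-pixel polarization-difference regression) could look like in PyTorch. This is only an illustrative reading of the abstract, not the paper's implementation: the class names, channel counts, kernel sizes, dilation rates and the sigmoid-bounded regression head are assumptions. The polarization difference itself is commonly taken as the (normalized) difference between intensities at orthogonal polarizer orientations, which is why a single scalar per pixel is regressed here.

```python
import torch
import torch.nn as nn


class FactorizedDilatedBlock(nn.Module):
    """Residual block built from factorized (3x1 / 1x3) convolutions with a
    dilated second pair, in the spirit of the efficient encoders the abstract
    refers to. All hyperparameters here are illustrative assumptions."""

    def __init__(self, channels: int, dilation: int = 2, dropout: float = 0.1):
        super().__init__()
        # First factorized pair (no dilation).
        self.conv3x1_1 = nn.Conv2d(channels, channels, (3, 1), padding=(1, 0))
        self.conv1x3_1 = nn.Conv2d(channels, channels, (1, 3), padding=(0, 1))
        self.bn1 = nn.BatchNorm2d(channels)
        # Second factorized pair with dilation to enlarge the receptive field.
        self.conv3x1_2 = nn.Conv2d(channels, channels, (3, 1),
                                   padding=(dilation, 0), dilation=(dilation, 1))
        self.conv1x3_2 = nn.Conv2d(channels, channels, (1, 3),
                                   padding=(0, dilation), dilation=(1, dilation))
        self.bn2 = nn.BatchNorm2d(channels)
        self.drop = nn.Dropout2d(dropout)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.conv3x1_1(x))
        out = self.relu(self.bn1(self.conv1x3_1(out)))
        out = self.relu(self.conv3x1_2(out))
        out = self.bn2(self.conv1x3_2(out))
        out = self.drop(out)
        return self.relu(out + x)  # residual connection keeps the block cheap


class SemanticPolarizationHeads(nn.Module):
    """Two lightweight 1x1 heads on a shared feature map: per-pixel class
    logits and a per-pixel polarization-difference regression (hypothetical)."""

    def __init__(self, channels: int, num_classes: int):
        super().__init__()
        self.semantic = nn.Conv2d(channels, num_classes, 1)
        self.polar = nn.Conv2d(channels, 1, 1)  # one scalar per pixel

    def forward(self, features):
        # Sigmoid bounds the regressed polarization difference to [0, 1];
        # this normalization choice is an assumption, not from the paper.
        return self.semantic(features), torch.sigmoid(self.polar(features))
```

A shared encoder stacked from such blocks, followed by the two heads, would yield both outputs in a single forward pass, which is consistent with the real-time, multi-task aim stated in the abstract.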

Details

Database:
OpenAIRE
Journal:
2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)
Accession number:
edsair.doi...........b14aca52f7042371aecddeca3e5f7d33
Full Text:
https://doi.org/10.1109/humanoids.2018.8625005