
Visual and thermal data for pedestrian and cyclist detection

Authors :
Stratis Kanarachos
Mark Elshaw
M. Nazmul Huda
Chitta Saha
Sujan Rajbhandari
Sarfraz Ahmed
Source :
Towards Autonomous Robotic Systems ISBN: 9783030253318, TAROS (2), Annual Conference Towards Autonomous Robotic Systems
Publication Year :
2019

Abstract

© Springer Nature Switzerland AG 2019. With the continued advancement of autonomous vehicles and their deployment on public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs across all types of environments (e.g. varied VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible-light sensors have been utilized for this purpose. More recently, thermal sensor systems, or a combination of visual and thermal sensors, have been employed for pedestrian detection together with advanced detection algorithms. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection because they extract features at higher levels of abstraction than typical hand-crafted detectors. Previous studies have shown that, among the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNNs have been applied to pedestrian detection, considerably less research has been undertaken on cyclist detection.
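To illustrate the halfway-fusion idea referred to in the abstract, the minimal PyTorch sketch below passes visible and thermal images through separate convolutional branches and concatenates their mid-level feature maps before a shared head. This is not the authors' architecture: the branch depths, channel counts, input sizes and class labels are illustrative assumptions only.

# Minimal halfway-fusion sketch (illustrative, not the paper's network).
import torch
import torch.nn as nn

class HalfwayFusionNet(nn.Module):
    def __init__(self, num_classes=3):  # e.g. background, pedestrian, cyclist (assumed)
        super().__init__()
        # Modality-specific branches: 3-channel visible, 1-channel thermal
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.thermal_branch = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        # Shared layers applied after the mid-level (halfway) fusion point
        self.fused = nn.Sequential(
            nn.Conv2d(128, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, rgb, thermal):
        f_rgb = self.rgb_branch(rgb)              # (N, 64, H/2, W/2)
        f_thermal = self.thermal_branch(thermal)  # (N, 64, H/2, W/2)
        # Halfway fusion: concatenate mid-level feature maps along channels
        f = torch.cat([f_rgb, f_thermal], dim=1)
        f = self.fused(f).flatten(1)
        return self.classifier(f)

# Example forward pass with dummy 128x128 inputs
model = HalfwayFusionNet()
rgb = torch.randn(2, 3, 128, 128)
thermal = torch.randn(2, 1, 128, 128)
scores = model(rgb, thermal)  # shape: (2, 3)

Fusing at this intermediate stage (rather than at the raw pixels or at the final scores) is what distinguishes halfway fusion from early and late fusion in the literature the abstract refers to.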

Details

ISBN :
978-3-030-25331-8
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....710d20d25bb862113986010288603973