
Semantic Segmentation on 3D Occupancy Grids for Automotive Radar

Authors :
Robert Prophet
Juan-Carlos Fuentes-Michel
Ingo Weber
Martin Vossiek
Anastasios Deligiannis
Source :
IEEE Access, Vol. 8, pp. 197917-197930 (2020)
Publication Year :
2020
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2020.

Abstract

Radar sensors have a clear advantage over other sensors in estimating the motion states of moving objects, because they measure velocity components within a single measurement cycle. Moreover, considerable success has already been achieved in classifying such objects. However, the advantage of instantaneous velocity measurement is lost for static objects, which makes their classification far more demanding. In this paper, we use semantic segmentation networks to distinguish between frequently occurring infrastructure objects. The resulting semantic grids provide a location-based classification of the vehicle's environment. Since even modern radars have a significantly poorer angular resolution than lidars, the comparatively sparse radar point cloud is accumulated in advance and transformed into 2D or 3D grids that serve as network inputs. Occupancy grids are particularly advantageous here, since they represent not only obstacles but also free space. With suitable parameter selection, which is challenging due to the complexity of radar measurements, the resulting grids associate well with camera images. Finally, to evaluate the possible advantages of 3D grids as network input with respect to the segmentation result, we created and evaluated a simulation dataset and two real-world datasets recorded in car parks and on motorways. Jaccard coefficients between 81% and 88% were achieved, depending on the dataset. It was also found that 3D inputs lead to improvements on the car park dataset.
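To make the preprocessing and evaluation described in the abstract more concrete, the Python sketch below illustrates the general idea of accumulating radar detections into a coarse 3D occupancy grid and scoring a labelled grid with the Jaccard coefficient (intersection over union). The grid shape, cell size, log-odds increment, and function names are illustrative assumptions, not the parameters or implementation used in the paper.

```python
import numpy as np

# --- Illustrative parameters (assumptions, not the paper's values) ---
GRID_SHAPE = (200, 200, 20)   # cells in x, y, z
CELL_SIZE = 0.25              # metres per cell
L_OCC = 0.85                  # log-odds increment for an occupied cell


def accumulate_occupancy(point_clouds):
    """Accumulate several radar scans (each an N x 3 array of x, y, z in metres,
    already transformed into a common frame) into a 3D occupancy grid."""
    log_odds = np.zeros(GRID_SHAPE, dtype=np.float32)
    origin = np.array(GRID_SHAPE) * CELL_SIZE / 2.0  # grid centred on the ego vehicle
    for points in point_clouds:
        idx = np.floor((points + origin) / CELL_SIZE).astype(int)
        # Keep only detections that fall inside the grid volume.
        valid = np.all((idx >= 0) & (idx < np.array(GRID_SHAPE)), axis=1)
        ix, iy, iz = idx[valid].T
        log_odds[ix, iy, iz] += L_OCC  # occupied evidence; free-space rays omitted here
    # Convert log-odds back to occupancy probabilities in [0, 1].
    return 1.0 - 1.0 / (1.0 + np.exp(log_odds))


def jaccard_per_class(pred, target, num_classes):
    """Per-class Jaccard coefficient (IoU) for integer-labelled grids."""
    scores = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        scores.append(inter / union if union > 0 else np.nan)
    return scores
```

A full inverse sensor model would additionally trace free space along each radar beam and weight detections by their measurement uncertainty; the sketch only marks occupied cells to keep the example short.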

Details

ISSN :
2169-3536
Volume :
8
Database :
OpenAIRE
Journal :
IEEE Access
Accession number :
edsair.doi.dedup.....c1c1ed4d18a8f2dfeaa722f159297423
Full Text :
https://doi.org/10.1109/access.2020.3032034