Multi‐vehicle multi‐sensor occupancy grid map fusion in vehicular networks.

Authors :
Meng, Xi
Duan, Dongliang
Feng, Tao
Source :
IET Communications (Wiley-Blackwell). Jan 2022, Vol. 16 Issue 1, p67-74. 8p.
Publication Year :
2022

Abstract

Sensing is an essential part of autonomous driving and intelligent transportation systems. It enables the vehicle to better understand itself and its surrounding environment. Vehicular networks support information sharing among different vehicles and hence enable multi‐vehicle multi‐sensor cooperative sensing, which can greatly improve sensing performance. However, several issues must be addressed. First, multi‐sensor data fusion needs to handle heterogeneous data formats. Second, the cooperative sensing process needs to cope with low data quality and perception blind spots for some vehicles. To solve these problems, this paper adopts the occupancy grid map to facilitate the fusion of multi‐vehicle and multi‐sensor data. The dynamic target detection boxes and pixel information from the camera data are mapped onto the static environment of the LiDAR point cloud, a kernel density estimation is designed to characterize the space‐based occupancy probability distribution of the fused data, and an occupancy grid map is generated at both the probability level and the spatial level. Real‐world experiments show that the proposed fusion framework better accommodates the data of different sensors and expands the sensing range through collaboration among multiple vehicles in vehicular networks.
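The following minimal sketch (an illustration of the two ideas named in the abstract, not the authors' code) shows how a kernel density estimate over LiDAR hit positions can be turned into a per-cell occupancy probability, and how grids shared by several vehicles can then be fused at the probability level. The grid size, extent, bandwidth, and the independent-opinion-pool fusion rule are all assumptions made for the example.

import numpy as np
from scipy.stats import gaussian_kde

def kde_occupancy_grid(points_xy, grid_shape=(50, 50), extent=(-25, 25, -25, 25)):
    """Estimate a per-cell occupancy probability from 2-D LiDAR hit positions."""
    xmin, xmax, ymin, ymax = extent
    xs = np.linspace(xmin, xmax, grid_shape[0])
    ys = np.linspace(ymin, ymax, grid_shape[1])
    gx, gy = np.meshgrid(xs, ys, indexing="ij")
    kde = gaussian_kde(points_xy.T)                      # KDE over the point cloud
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(grid_shape)
    return density / density.max()                       # normalise to [0, 1]

def fuse_grids(grids):
    """Probability-level fusion of per-vehicle grids (assumed log-odds / opinion pool)."""
    grids = np.clip(np.asarray(grids), 1e-6, 1 - 1e-6)
    log_odds = np.log(grids / (1 - grids)).sum(axis=0)   # accumulate evidence in log-odds
    return 1.0 / (1.0 + np.exp(-log_odds))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two vehicles observe the same obstacle cluster from slightly different viewpoints.
    obstacle = rng.normal(loc=[5.0, -3.0], scale=0.8, size=(200, 2))
    grid_a = kde_occupancy_grid(obstacle + rng.normal(0, 0.2, obstacle.shape))
    grid_b = kde_occupancy_grid(obstacle + rng.normal(0, 0.2, obstacle.shape))
    fused = fuse_grids([grid_a, grid_b])
    print("peak fused occupancy:", fused.max())

In this sketch each vehicle's grid is built independently and only the resulting probabilities are exchanged, which mirrors the paper's motivation of fusing heterogeneous sensor data on a common occupancy-grid representation rather than sharing raw sensor streams.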

Details

Language :
English
ISSN :
1751-8628
Volume :
16
Issue :
1
Database :
Academic Search Index
Journal :
IET Communications (Wiley-Blackwell)
Publication Type :
Academic Journal
Accession number :
154460927
Full Text :
https://doi.org/10.1049/cmu2.12314