Gaussian Radar Transformer for Semantic Segmentation in Noisy Radar Data
- Publication Year :
- 2022
Abstract
- Scene understanding is crucial for autonomous robots in dynamic environments to predict future states, avoid collisions, and plan paths. Camera and LiDAR perception have made tremendous progress in recent years but face limitations under adverse weather conditions. To leverage the full potential of multi-modal sensor suites, radar sensors are essential for safety-critical tasks and are already installed in most new vehicles today. In this paper, we address the problem of semantic segmentation of moving objects in radar point clouds to enhance the perception of the environment with another sensor modality. Instead of aggregating multiple scans to densify the point clouds, we propose a novel approach based on the self-attention mechanism to accurately perform sparse, single-scan segmentation. Our approach, called Gaussian Radar Transformer, includes the newly introduced Gaussian transformer layer, which replaces softmax normalization with a Gaussian function to decouple the contributions of individual points. To tackle the challenge of capturing long-range dependencies with transformers, we propose attentive up- and downsampling modules that enlarge the receptive field and capture strong spatial relations. We compare our approach to other state-of-the-art methods on the RadarScenes data set and show superior segmentation quality in diverse environments, even without exploiting temporal information.
- Comment: Accepted for publication in IEEE Robotics and Automation Letters (RA-L)
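- The abstract's central idea, replacing softmax normalization in the attention layer with a Gaussian function so that each point's contribution is decoupled from the others, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical reading of that idea only: a Gaussian kernel over query-key feature distances with an assumed bandwidth `sigma`, not the paper's actual Gaussian transformer layer or its attentive up- and downsampling modules.

```python
import torch

def gaussian_attention(q, k, v, sigma=1.0):
    """Elementwise Gaussian weighting in place of softmax (illustrative sketch only).

    q, k, v: tensors of shape (n_points, dim); sigma is a hypothetical bandwidth.
    """
    # Pairwise squared distances between query and key features.
    dist2 = torch.cdist(q, k).pow(2)                 # (n_points, n_points)
    # Gaussian kernel instead of softmax: weights are not normalized across keys,
    # so each point's contribution is independent of the other points.
    weights = torch.exp(-dist2 / (2.0 * sigma ** 2))
    return weights @ v                               # aggregated values

# Hypothetical usage on random single-scan point features
q = k = v = torch.randn(64, 32)
out = gaussian_attention(q, k, v)                    # (64, 32)
```

- The key contrast with standard softmax attention is that the weights above sum to no fixed total per query, so a strong response from one point does not suppress the contribution of another; how the paper integrates this weighting into its transformer blocks is described in the full text.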
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2212.03690
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1109/LRA.2022.3226030