
TAILORED FEATURES FOR SEMANTIC SEGMENTATION WITH A DGCNN USING FREE TRAINING SAMPLES OF A COLORED AIRBORNE POINT CLOUD

Authors :
E. Widyaningrum
M. K. Fajari
R. C. Lindenbergh
M. Hahn
Source :
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XLIII-B2-2020, pp. 339-346 (2020)
Publication Year :
2020
Publisher :
Copernicus Publications, 2020.

Abstract

Automation of 3D LiDAR point cloud processing is expected to increase the production rate of many applications, including automatic map generation. Fast development of high-end hardware has boosted the expansion of deep learning research for 3D classification and segmentation. However, deep learning requires large amounts of high-quality training samples. The generation of training samples for accurate classification results, especially for airborne point cloud data, is still problematic. Moreover, it is still unclear which customized features are best suited for segmenting airborne point cloud data. This paper proposes semi-automatic point cloud labelling and examines the potential of combining different tailor-made features for pointwise semantic segmentation of an airborne point cloud. We implement a Dynamic Graph CNN (DGCNN) approach to classify airborne point cloud data into four land cover classes: bare-land, trees, buildings and roads. The DGCNN architecture is chosen as this network combines two approaches, PointNet and graph CNNs, to exploit the geometric relationships between points. For the experiments, we train the DGCNN on an airborne point cloud and a co-aligned orthophoto of the Surabaya city area of Indonesia using three different tailor-made feature combinations: points with RGB (Red, Green, Blue) color, points with original LiDAR features (Intensity, Return number, Number of returns), so-called IRN, and points with two spectral colors and Intensity (Red, Green, Intensity), so-called RGI. The overall accuracy on the testing area indicates that using RGB information gives the best segmentation result of 81.05%, while IRN and RGI give accuracy values of 76.13% and 79.81%, respectively.
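The three tailor-made feature combinations described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the per-point attributes are already available as NumPy arrays, and the function name and argument layout are hypothetical.

```python
import numpy as np

def make_features(xyz, rgb, intensity, return_number, num_returns, mode="RGB"):
    """Assemble per-point input vectors for one of the three
    feature combinations named in the abstract (RGB, IRN, RGI).

    xyz:            (N, 3) point coordinates
    rgb:            (N, 3) colors sampled from the co-aligned orthophoto
    intensity:      (N,)   LiDAR intensity
    return_number:  (N,)   LiDAR return number
    num_returns:    (N,)   LiDAR number of returns
    """
    if mode == "RGB":    # coordinates + orthophoto color
        extra = rgb
    elif mode == "IRN":  # coordinates + original LiDAR features
        extra = np.stack([intensity, return_number, num_returns], axis=1)
    elif mode == "RGI":  # coordinates + two spectral bands + intensity
        extra = np.stack([rgb[:, 0], rgb[:, 1], intensity], axis=1)
    else:
        raise ValueError(f"unknown feature mode: {mode}")
    # Each combination yields a 6-dimensional input per point.
    return np.hstack([xyz, extra])
```

In every mode the network input stays 6-dimensional (3 coordinates plus 3 extra channels), so the same DGCNN architecture can be trained on each combination without modification.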

Details

Language :
English
ISSN :
1682-1750 and 2194-9034
Volume :
XLIII-B2-2020
Database :
Directory of Open Access Journals
Journal :
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Publication Type :
Academic Journal
Accession number :
edsdoj.b174caefbc442fa243c479425721d0
Document Type :
article
Full Text :
https://doi.org/10.5194/isprs-archives-XLIII-B2-2020-339-2020