
Consistency Self-Training Semi-Supervised Method for Road Extraction from Remote Sensing Images.

Authors :
Gu, Xingjian
Yu, Supeng
Huang, Fen
Ren, Shougang
Fan, Chengcheng
Source :
Remote Sensing. Nov2024, Vol. 16 Issue 21, p3945. 15p.
Publication Year :
2024

Abstract

Road extraction techniques based on remote sensing images have advanced significantly. Currently, fully supervised road segmentation neural networks require a large number of densely labeled road samples, which limits their applicability in large-scale scenarios. Consequently, semi-supervised methods that use less labeled data have gained increasing attention. However, the imbalance between a small quantity of labeled data and a large volume of unlabeled data leads to local detail errors and overall cognitive mistakes in semi-supervised road extraction. To address this challenge, this paper proposes a novel consistency self-training semi-supervised method (CSSnet), which effectively learns from a limited number of labeled samples and a large amount of unlabeled data. This method integrates self-training semi-supervised segmentation with semi-supervised classification. The semi-supervised segmentation component relies on an enhanced generative adversarial network for semantic segmentation, which significantly reduces local detail errors. The semi-supervised classification component relies on an upgraded mean-teacher network to handle overall cognitive errors. Our method exhibits excellent performance with a modest amount of labeled data. This study was validated on three separate road datasets comprising high-resolution remote sensing satellite images and UAV photographs. Experimental findings showed that our method consistently outperformed state-of-the-art semi-supervised methods and several classic fully supervised methods. [ABSTRACT FROM AUTHOR]
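The abstract names a mean-teacher network as the semi-supervised classification component. The record does not include the paper's actual architecture, losses, or hyperparameters, so the following is only a minimal PyTorch sketch of the generic mean-teacher consistency idea (EMA teacher plus a consistency loss between student and teacher predictions); the model, `alpha`, and `lambda_cons` values are illustrative assumptions, not the authors' CSSnet implementation.

```python
# Minimal mean-teacher consistency sketch (assumed, not the paper's CSSnet code).
import torch
import torch.nn.functional as F

def ema_update(teacher, student, alpha=0.99):
    """Update teacher weights as an exponential moving average of the student."""
    with torch.no_grad():
        for t_param, s_param in zip(teacher.parameters(), student.parameters()):
            t_param.mul_(alpha).add_(s_param, alpha=1 - alpha)

def train_step(student, teacher, optimizer, labeled, unlabeled, lambda_cons=1.0):
    images, masks = labeled      # small labeled set: road images and dense masks
    weak, strong = unlabeled     # two augmentations of the same unlabeled images

    # Supervised segmentation loss on the labeled samples.
    sup_loss = F.cross_entropy(student(images), masks)

    # Consistency loss: the student's prediction on a strong augmentation
    # should match the EMA teacher's prediction on a weak augmentation.
    with torch.no_grad():
        teacher_prob = torch.softmax(teacher(weak), dim=1)
    student_prob = torch.softmax(student(strong), dim=1)
    cons_loss = F.mse_loss(student_prob, teacher_prob)

    loss = sup_loss + lambda_cons * cons_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)   # teacher is typically a deep copy of the student at init
    return loss.item()
```

In typical mean-teacher setups the teacher is initialized as a copy of the student and never receives gradients; only the EMA update changes its weights. How CSSnet combines this branch with its GAN-based segmentation branch is described in the full paper, not in this record.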

Details

Language :
English
ISSN :
2072-4292
Volume :
16
Issue :
21
Database :
Academic Search Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
180782455
Full Text :
https://doi.org/10.3390/rs16213945