
Development of a Burned Area Processor Based on Sentinel-2 Data Using Deep Learning

Authors :
Lisa Knopp
Source :
PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science. 89:357-358
Publication Year :
2021
Publisher :
Springer Science and Business Media LLC, 2021.

Abstract

Wildfires have major ecological, social and economic consequences: they destroy flora and fauna, contribute to global warming by raising carbon emissions, and affect lives and property. In order to coordinate emergency forces on site and assess damage and consequences, information about the location, extent and frequency of fires is needed. Remote sensing is particularly well suited to provide this information, as satellite sensors such as Sentinel-2 deliver global coverage at a high revisit rate. Over the last decades, several methods have been developed to detect burned areas in satellite imagery. Since 2001, artificial neural networks have been used as well, but so far only shallow architectures, not the deep convolutional neural networks that are nowadays widely used for image segmentation. The aim of this thesis is to analyse the potential of deep learning for burned area detection. Therefore, a convolutional neural network based on a U-Net architecture is implemented and tested. The dataset for training, validation and testing consists of Sentinel-2 post-fire imagery and corresponding burned area reference masks originating from the California Department of Forestry and Fire Protection, the Portuguese Institute for Nature Conservation and Forests and the German Aerospace Center. Several performance tests on different optimizers and input data are conducted in order to define a network model especially suited for burned area detection. The final model receives the Sentinel-2 spectral bands of the visible (B2, B3, B4), near-infrared (B8) and shortwave infrared (B11, B12) domains, which allows transferability to Landsat imagery with similar spectral channels. It is trained with radiometrically and geometrically augmented data and uses the Adagrad algorithm for loss optimization. The model achieves an overall accuracy of 0.96 and a kappa coefficient of 0.91, demonstrating the suitability of deep learning for burned area detection. An independent qualitative accuracy assessment with reference data from the Copernicus Emergency Management Service confirms these results, but also identifies problems with harvested cropland and rocky terrain. Compared to a bi-temporal classification approach currently in use at the German Aerospace Center, the methodology developed in this thesis achieves similar overall accuracy with a significantly shorter processing time. In addition, the final network model tends to produce homogeneous burned areas with smooth delineations rather than a very detailed classification. However, classification results can easily be fine-tuned towards higher precision by raising the probability threshold at which a pixel is predicted as burned.
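The abstract does not reproduce the implementation, but the described setup (a U-Net-style binary segmentation network over six Sentinel-2 bands, Adagrad optimization, and a tunable probability threshold for the burned class) can be illustrated with a minimal sketch. The following Python/PyTorch code is an assumption-laden illustration, not the thesis's actual model: the framework, layer depths, channel counts, learning rate and the 0.7 threshold are placeholders chosen for demonstration.

# Minimal sketch (assumed PyTorch, assumed layer sizes) of a U-Net-style
# burned-area segmentation model with 6 input bands (B2, B3, B4, B8, B11, B12).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self, in_channels=6):
        super().__init__()
        self.enc1 = conv_block(in_channels, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, 1)  # per-pixel burned-area logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

model = SmallUNet(in_channels=6)
optimizer = torch.optim.Adagrad(model.parameters(), lr=0.01)  # Adagrad, as named in the abstract
criterion = nn.BCEWithLogitsLoss()

# Dummy batch of 6-band 256x256 tiles; real training would use Sentinel-2
# post-fire patches and burned-area reference masks.
x = torch.randn(2, 6, 256, 256)
y = torch.randint(0, 2, (2, 1, 256, 256)).float()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()

# Raising the probability threshold above 0.5 trades recall for precision,
# matching the fine-tuning step mentioned at the end of the abstract.
with torch.no_grad():
    probs = torch.sigmoid(model(x))
    burned_mask = probs > 0.7  # stricter threshold -> fewer false positives

The threshold line illustrates the abstract's closing point: because the network outputs per-pixel probabilities, precision can be increased after training simply by requiring a higher probability before labelling a pixel as burned.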

Details

ISSN :
2512-2819 and 2512-2789
Volume :
89
Database :
OpenAIRE
Journal :
PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science
Accession number :
edsair.doi.dedup.....1ca68ac2cca6ad607c50cbae6b0b29b4
Full Text :
https://doi.org/10.1007/s41064-021-00177-6