1. Landslide Susceptibility Mapping by Fusing Convolutional Neural Networks and Vision Transformer.
- Author
- Bao, Shuai; Liu, Jiping; Wang, Liang; Konečný, Milan; Che, Xianghong; Xu, Shenghua; Li, Pengpeng
- Subjects
- LANDSLIDE hazard analysis; CONVOLUTIONAL neural networks; DEEP learning; EMERGENCY management; TRANSFORMER models
- Abstract
Landslide susceptibility mapping (LSM) is an important decision basis for regional landslide hazard risk management, territorial spatial planning and landslide-related decision making. Current convolutional neural network (CNN)-based LSM models do not adequately account for the spatial nature of texture features, while vision transformer (ViT)-based LSM models require large amounts of training data. In this study, we overcome these shortcomings by fusing the two architectures into two hybrid deep learning models, the bottleneck transformer network (BoTNet) and the convolutional vision transformer network (ConViT), and used the fused models to predict the probability of landslide occurrence. First, we integrated historical landslide data with landslide evaluation factors and analysed whether collinearity existed among the factors. Then, we compared the testing accuracy and generalisation ability of the CNN, ViT, BoTNet and ConViT models. Finally, all four LSM models were used to predict the probability of landslide occurrence in Pingwu County, Sichuan Province, China. Among them, BoTNet and ConViT achieved the highest accuracy, both at 87.78%, an improvement of 1.11% over the single models, while ConViT achieved the highest F1-score at 87.64%, an improvement of 1.28% over the single models. The results indicate that fusing CNN and ViT yields better LSM performance than either single model. The evaluation results of this study can also serve as a basic tool for landslide hazard risk quantification and disaster prevention in Pingwu County. [ABSTRACT FROM AUTHOR]
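For readers who want a concrete picture of the CNN-ViT fusion the abstract describes, the sketch below illustrates the core idea behind a BoTNet-style block: a ResNet-like bottleneck whose 3x3 spatial convolution is replaced by multi-head self-attention over the feature map, so convolution supplies local texture features while attention models their global spatial relationships. This is a minimal sketch assuming a PyTorch implementation; the class name `BotBlock`, the channel and head sizes, and the omission of BoTNet's relative position encodings are illustrative simplifications, not the authors' code.

```python
import torch
import torch.nn as nn

class BotBlock(nn.Module):
    """ResNet-style bottleneck whose 3x3 conv is replaced by self-attention."""
    def __init__(self, channels: int, heads: int = 4):
        super().__init__()
        self.reduce = nn.Conv2d(channels, channels // 4, kernel_size=1)  # 1x1 conv: C -> C/4
        self.attn = nn.MultiheadAttention(channels // 4, heads, batch_first=True)
        self.expand = nn.Conv2d(channels // 4, channels, kernel_size=1)  # 1x1 conv: C/4 -> C
        self.norm = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        y = self.reduce(x)
        seq = y.flatten(2).transpose(1, 2)     # (B, H*W, C/4): treat pixels as tokens
        seq, _ = self.attn(seq, seq, seq)      # global self-attention over all positions
        y = seq.transpose(1, 2).reshape(b, c // 4, h, w)
        return torch.relu(self.norm(self.expand(y)) + x)  # residual connection

# Example: a 16x16 feature map with 64 channels, as in a late CNN stage
x = torch.randn(2, 64, 16, 16)
print(BotBlock(64)(x).shape)  # torch.Size([2, 64, 16, 16])
```

Placing attention only in the late, low-resolution stages keeps the quadratic cost of self-attention manageable while still letting the convolutional stem extract local texture, which is the general design motivation behind hybrids such as BoTNet and ConViT.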
- Published
- 2023