1. The Spatially Seamless Spatiotemporal Fusion Model Based on Generative Adversarial Networks
- Author
ChenYang Weng, Yulin Zhan, Xingfa Gu, Jian Yang, Yan Liu, Hong Guo, Zilong Lian, Shiyuan Zhang, Zhangjie Wang, and Xuechun Zhao
- Subjects
Deep learning, error distribution, Gaofen-6, spatial resolution, spatiotemporal fusion, splicing method, Ocean engineering, TC1501-1800, Geophysics. Cosmic physics, QC801-809
- Abstract
Spatiotemporal fusion is a method of fusing high-spatial-resolution, low-temporal-resolution remote sensing images with low-spatial-resolution, high-temporal-resolution images to obtain imagery with both high spatial and high temporal resolution. Such imagery provides data support for the temporal observation of fine-scale objects and plays an important role in Earth science, environmental monitoring, and related fields. This article highlights an issue that is often overlooked in deep learning-based spatiotemporal fusion: the discontinuity between adjacent image blocks, which can degrade both the visual quality of the fused remote sensing images and subsequent applications. To address this, the article proposes a spatially seamless stitching approach that optimizes deep learning-based spatiotemporal fusion models and yields high-quality fused images with smoother transitions between blocks. The spatiotemporal fusion model used in the experiments is the generative adversarial network-based spatiotemporal fusion model (GAN-STFM), and the data come from the Beijing Gaofen-6 dataset (BJGF6). With the proposed splicing method, the ratio of the root-mean-square error (RMSE) along the splicing seams to the overall RMSE is reduced from 1.28 to 0.99, effectively improving image continuity. The method has the potential to improve the utility of deep learning-based spatiotemporal fusion algorithms and is of practical value for generating large-scale, long-time-series remote sensing datasets with both high temporal and high spatial resolution.
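The seam-RMSE-to-overall-RMSE ratio quoted above can be illustrated with a minimal sketch. The abstract does not specify the block size, seam width, or masking scheme used in the paper, so the values below (a 64-pixel tile grid and a 2-pixel band on each side of every block boundary) and the function names are assumptions made purely for illustration, not the authors' implementation.

```python
import numpy as np

def rmse(pred, ref):
    """Root-mean-square error between two arrays."""
    return np.sqrt(np.mean((pred.astype(np.float64) - ref.astype(np.float64)) ** 2))

def seam_to_overall_rmse_ratio(pred, ref, block_size=64, seam_width=2):
    """Ratio of RMSE measured on pixels near block-splicing seams
    to RMSE measured over the whole image (assumed evaluation scheme).

    pred, ref  : 2-D arrays (fused image band and reference band)
    block_size : tile size assumed for block-wise network inference
    seam_width : pixels counted on each side of a block boundary
    """
    h, w = pred.shape
    seam_mask = np.zeros((h, w), dtype=bool)
    # mark pixels along horizontal and vertical block boundaries
    for y in range(block_size, h, block_size):
        seam_mask[max(0, y - seam_width):y + seam_width, :] = True
    for x in range(block_size, w, block_size):
        seam_mask[:, max(0, x - seam_width):x + seam_width] = True

    return rmse(pred[seam_mask], ref[seam_mask]) / rmse(pred, ref)

if __name__ == "__main__":
    # toy example with synthetic data: a ratio near 1.0 means the seam
    # pixels are no worse than the rest of the image
    rng = np.random.default_rng(0)
    ref = rng.random((256, 256))
    pred = ref + rng.normal(scale=0.01, size=ref.shape)
    print(seam_to_overall_rmse_ratio(pred, ref))
```

Under this reading, a ratio above 1.0 (such as the 1.28 reported before stitching) indicates that errors concentrate along block boundaries, while a ratio of about 0.99 after stitching means the seams are statistically indistinguishable from the image interior.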
- Published
- 2024