Remote Sensing Pansharpening by Full-Depth Feature Fusion

Authors :
Zi-Rong Jin
Yu-Wei Zhuo
Tian-Jing Zhang
Xiao-Xu Jin
Shuaiqi Jing
Liang-Jian Deng
Source :
Remote Sensing, Vol 14, Iss 3, p 466 (2022)
Publication Year :
2022
Publisher :
MDPI AG, 2022.

Abstract

Pansharpening is an important yet challenging remote sensing image processing task that aims to reconstruct a high-resolution (HR) multispectral (MS) image by fusing an HR panchromatic (PAN) image with a low-resolution (LR) MS image. Although deep learning (DL)-based pansharpening methods have achieved encouraging performance, they fail to fully utilize the deep semantic features and shallow contextual features when fusing the HR-PAN and LR-MS images. In this paper, we propose an efficient full-depth feature fusion network (FDFNet) for remote sensing pansharpening. Specifically, we design three distinctive branches: a PAN-branch, an MS-branch, and a fusion-branch. The features extracted by the PAN and MS branches are progressively injected into the fusion branch at every depth, making the information fusion broader and more comprehensive. With this structure, low-level contextual features and high-level semantic features can be adequately characterized and integrated. Extensive experiments on reduced- and full-resolution datasets acquired from the WorldView-3, QuickBird, and GaoFen-2 sensors demonstrate that the proposed FDFNet, with fewer than 100,000 parameters, outperforms other detail injection-based proposals and several state-of-the-art approaches, both visually and quantitatively.
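The three-branch, full-depth injection idea from the abstract can be sketched as follows. This is a minimal NumPy toy, not the paper's implementation: the per-pixel linear map stands in for a real convolution, and all layer sizes, weight shapes, and the additive injection rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_block(x, w):
    """Toy 'conv layer': per-pixel linear map followed by ReLU (stand-in for a real convolution)."""
    return np.maximum(x @ w, 0.0)

# Hypothetical sizes: H x W spatial grid, C feature channels, D depths.
H, W, C, D = 8, 8, 16, 4

# Stand-ins for branch inputs after an initial feature-embedding step.
pan_feat = rng.standard_normal((H, W, C))  # features from the HR-PAN image
ms_feat = rng.standard_normal((H, W, C))   # features from the (upsampled) LR-MS image
fusion = pan_feat + ms_feat                # initial state of the fusion branch

# Independent weights per branch and per depth (all hypothetical).
w_pan = rng.standard_normal((D, C, C)) * 0.1
w_ms = rng.standard_normal((D, C, C)) * 0.1
w_fus = rng.standard_normal((D, C, C)) * 0.1

for d in range(D):
    pan_feat = conv_block(pan_feat, w_pan[d])  # PAN-branch advances one depth
    ms_feat = conv_block(ms_feat, w_ms[d])     # MS-branch advances one depth
    # Full-depth fusion: inject both branches into the fusion branch at EVERY depth,
    # so shallow contextual and deep semantic features both reach the fused output.
    fusion = conv_block(fusion, w_fus[d]) + pan_feat + ms_feat

print(fusion.shape)  # (8, 8, 16)
```

The key design point mirrored here is that the injection happens at each depth of the loop, rather than only once at the start (early fusion) or once at the end (late fusion).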

Details

Language :
English
ISSN :
20724292
Volume :
14
Issue :
3
Database :
OpenAIRE
Journal :
Remote Sensing
Accession number :
edsair.doi.dedup.....d0cf5cf990cafab328299a8950271026