
TransY-Net: Learning Fully Transformer Networks for Change Detection of Remote Sensing Images

Authors :
Yan, Tianyu
Wan, Zifu
Zhang, Pingping
Cheng, Gong
Lu, Huchuan
Publication Year :
2023

Abstract

In the remote sensing field, Change Detection (CD) aims to identify and localize the changed regions from dual-phase images of the same area. Recently, it has achieved great progress with the advances of deep learning. However, current methods generally deliver incomplete CD regions and irregular CD boundaries due to the limited representation ability of the extracted visual features. To relieve these issues, in this work we propose a novel Transformer-based learning framework named TransY-Net for remote sensing image CD, which improves the feature extraction from a global view and combines multi-level visual features in a pyramid manner. More specifically, the proposed framework first utilizes the advantages of Transformers in long-range dependency modeling. It can help to learn more discriminative global-level features and obtain complete CD regions. Then, we introduce a novel pyramid structure to aggregate multi-level visual features from Transformers for feature enhancement. The pyramid structure grafted with a Progressive Attention Module (PAM) can improve the feature representation ability with additional inter-dependencies through spatial and channel attentions. Finally, to better train the whole framework, we utilize deeply-supervised learning with multiple boundary-aware loss functions. Extensive experiments demonstrate that our proposed method achieves a new state-of-the-art performance on four optical and two SAR image CD benchmarks. The source code is released at https://github.com/Drchip61/TransYNet.

Comment: This work is accepted by TGRS 2023. It is an extension of our ACCV 2022 paper and arXiv:2210.00757.
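To make the attention idea in the abstract concrete, below is a minimal sketch (not the authors' released code) of applying channel attention followed by spatial attention to a fused dual-phase feature map, in the spirit of the PAM described above. It assumes PyTorch; the module name ChannelSpatialAttention, the variables feat_t1/feat_t2, and the absolute-difference fusion are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    """Hypothetical sketch: channel attention followed by spatial attention."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Channel attention: squeeze spatial dims, then re-weight each channel.
        self.channel_mlp = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: re-weight each location from pooled channel statistics.
        self.spatial_conv = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_mlp(x)                       # channel re-weighting
        avg_map = x.mean(dim=1, keepdim=True)             # per-pixel channel mean
        max_map, _ = x.max(dim=1, keepdim=True)           # per-pixel channel max
        x = x * self.spatial_conv(torch.cat([avg_map, max_map], dim=1))
        return x

# Illustrative usage on a fused dual-phase feature map (shapes are arbitrary).
feat_t1 = torch.randn(1, 64, 32, 32)   # features from the first-phase image
feat_t2 = torch.randn(1, 64, 32, 32)   # features from the second-phase image
fused = torch.abs(feat_t1 - feat_t2)   # one common way to fuse dual-phase features
attended = ChannelSpatialAttention(64)(fused)
print(attended.shape)                   # torch.Size([1, 64, 32, 32])
```

In the full framework such attention would be applied at several pyramid levels and the outputs combined before prediction; this snippet only illustrates the spatial-and-channel attention mechanism named in the abstract.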

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2310.14214
Document Type :
Working Paper