
A Multiscale Framework With Unsupervised Learning for Remote Sensing Image Registration.

Authors :
Ye, Yuanxin
Tang, Tengfeng
Zhu, Bai
Yang, Chao
Li, Bo
Hao, Siyuan
Source :
IEEE Transactions on Geoscience & Remote Sensing. May 2022, Vol. 60, pp. 1-15.
Publication Year :
2022

Abstract

Registration for multisensor or multimodal image pairs with a large degree of distortion is a fundamental task for many remote sensing applications. To achieve accurate and low-cost remote sensing image registration, we propose a multiscale framework with unsupervised learning, named MU-Net. Without costly ground truth labels, MU-Net directly learns the end-to-end mapping from the image pairs to their transformation parameters. MU-Net stacks several deep neural network (DNN) models on multiple scales to form a coarse-to-fine registration pipeline, which prevents backpropagation from falling into a local extremum and resists significant image distortions. We design a novel loss function paradigm based on structural similarity, which makes MU-Net suitable for various types of multimodal images. MU-Net is compared with traditional feature-based and area-based methods, as well as supervised and other unsupervised learning methods, on optical-optical, optical-infrared, optical-synthetic aperture radar (SAR), and optical-map datasets. Experimental results show that MU-Net achieves more comprehensive and accurate registration performance between these image pairs with geometric and radiometric distortions. We share the code implemented in PyTorch at https://github.com/yeyuanxin110/MU-Net.
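The abstract outlines the core idea: networks stacked over multiple scales predict transformation parameters, and training is unsupervised, driven by a structural-similarity criterion between the warped and reference images rather than by ground-truth parameters. The PyTorch sketch below illustrates that coarse-to-fine, unsupervised setup in miniature; every name here (ParamNet, warp, similarity_loss, the scale list) and the simple L1 similarity term are illustrative assumptions, not the authors' released implementation, which is available at the GitHub link above.

# Hypothetical sketch of a coarse-to-fine, unsupervised registration loop in PyTorch.
# Names and network sizes are illustrative, not taken from the MU-Net release.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParamNet(nn.Module):
    """Small CNN mapping a stacked (reference, moving) pair to 6 affine parameters."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, 6)
        # Start from the identity transform so early iterations apply "no warp".
        nn.init.zeros_(self.fc.weight)
        self.fc.bias.data = torch.tensor([1., 0., 0., 0., 1., 0.])

    def forward(self, ref, mov):
        x = self.features(torch.cat([ref, mov], dim=1)).flatten(1)
        return self.fc(x).view(-1, 2, 3)   # one 2x3 affine matrix per sample

def warp(img, theta):
    """Warp `img` with a batch of 2x3 affine matrices via a differentiable sampler."""
    grid = F.affine_grid(theta, img.size(), align_corners=False)
    return F.grid_sample(img, grid, align_corners=False)

def similarity_loss(a, b):
    # Stand-in similarity term; the paper defines its own structural-similarity-based
    # loss designed to be robust across multimodal (e.g., optical-SAR) pairs.
    return F.l1_loss(a, b)

def register_coarse_to_fine(ref, mov, nets, scales=(0.25, 0.5, 1.0)):
    """Apply one network per scale, refining the estimated warp from coarse to fine."""
    warped = mov
    for net, s in zip(nets, scales):
        r = F.interpolate(ref, scale_factor=s, mode='bilinear', align_corners=False)
        m = F.interpolate(warped, scale_factor=s, mode='bilinear', align_corners=False)
        theta = net(r, m)            # predict parameters at the current scale
        warped = warp(warped, theta) # apply them to the full-resolution image
    return warped

# One unsupervised training step: no ground-truth transformation labels are used.
nets = [ParamNet() for _ in range(3)]
opt = torch.optim.Adam([p for n in nets for p in n.parameters()], lr=1e-4)
ref, mov = torch.rand(1, 1, 256, 256), torch.rand(1, 1, 256, 256)
opt.zero_grad()
warped = register_coarse_to_fine(ref, mov, nets)
loss = similarity_loss(warped, ref)
loss.backward()
opt.step()

In this toy version, the coarse scales absorb large geometric offsets before the fine scales refine the alignment, which is one way to read the abstract's claim that the multiscale pipeline resists significant distortions and keeps optimization away from poor local extrema.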

Details

Language :
English
ISSN :
0196-2892
Volume :
60
Database :
Academic Search Index
Journal :
IEEE Transactions on Geoscience & Remote Sensing
Publication Type :
Academic Journal
Accession number :
157582500
Full Text :
https://doi.org/10.1109/TGRS.2022.3167644