Multiple-environment Self-adaptive Network for aerial-view geo-localization.

Authors :
Wang, Tingyu
Zheng, Zhedong
Sun, Yaoqi
Yan, Chenggang
Yang, Yi
Chua, Tat-Seng
Source :
Pattern Recognition. Aug 2024, Vol. 152.
Publication Year :
2024

Abstract

Aerial-view geo-localization aims to determine an unknown position by matching a drone-view image with geo-tagged satellite-view images. The task is usually framed as an image retrieval problem, whose key is to design deep neural networks that learn discriminative image descriptors. However, existing methods suffer large performance drops under realistic weather, such as rain and fog, since they do not take into account the domain shift between the training data and the multiple test environments. To narrow this domain gap, we propose a Multiple-environment Self-adaptive Network (MuSe-Net) that dynamically compensates for the domain shift caused by environmental changes. In particular, MuSe-Net employs a two-branch neural network containing one multiple-environment style extraction network and one self-adaptive feature extraction network. As the names imply, the multiple-environment style extraction network extracts environment-related style information, while the self-adaptive feature extraction network utilizes an adaptive modulation module to dynamically minimize the environment-related style gap. Extensive experiments on three widely used benchmarks, i.e., University-1652, SUES-200, and CVUSA, demonstrate that the proposed MuSe-Net achieves competitive results for geo-localization in multiple environments. Furthermore, we observe that the proposed method also shows great potential under unseen extreme weather, such as a mixture of fog, rain, and snow.

• Identifying a key challenge in visual geo-localization: weather and illumination changes.
• Presenting MuSe-Net to alleviate the interference caused by environmental changes.
• Designing Residual SPADE for efficient training and boosting feature discrimination.
• Results on three geo-localization benchmarks confirm the superiority of our MuSe-Net.

[ABSTRACT FROM AUTHOR]
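To make the adaptive modulation idea concrete, below is a minimal NumPy sketch of SPADE-style conditional modulation with a residual connection, loosely mirroring the "Residual SPADE" described above. It is not the authors' implementation: the style code, the linear maps `W_gamma`/`W_beta`, and all shapes are illustrative assumptions. The environment style code predicts a per-channel scale and shift that re-style the normalized feature map, and the input is added back so content is preserved.

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Normalize each channel of a (C, H, W) feature map to zero mean, unit variance."""
    mean = x.mean(axis=(1, 2), keepdims=True)
    std = x.std(axis=(1, 2), keepdims=True)
    return (x - mean) / (std + eps)

def residual_spade(features, style_code, W_gamma, W_beta):
    """Hypothetical sketch: an environment style code predicts per-channel
    scale (gamma) and shift (beta) applied to normalized features; a residual
    connection keeps the original content alongside the re-styled signal."""
    gamma = (W_gamma @ style_code)[:, None, None]  # (C, 1, 1) scale
    beta = (W_beta @ style_code)[:, None, None]    # (C, 1, 1) shift
    modulated = gamma * instance_norm(features) + beta
    return features + modulated  # residual connection

rng = np.random.default_rng(0)
C, H, W, D = 4, 8, 8, 3                     # channels, spatial dims, style-code length
features = rng.standard_normal((C, H, W))   # backbone feature map
style_code = rng.standard_normal(D)         # e.g. summarizes fog/rain/night statistics
W_gamma = rng.standard_normal((C, D))
W_beta = rng.standard_normal((C, D))

out = residual_spade(features, style_code, W_gamma, W_beta)
print(out.shape)  # (4, 8, 8)
```

In the full network these modulation parameters would be produced by the multiple-environment style extraction branch and learned end-to-end, so that environment-specific style is removed from the descriptor before retrieval.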

Details

Language :
English
ISSN :
0031-3203
Volume :
152
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
176784458
Full Text :
https://doi.org/10.1016/j.patcog.2024.110363