Feature Matching and Position Matching Between Optical and SAR With Local Deep Feature Descriptor
- Source :
- IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 15, pp. 448-462 (2022)
- Publication Year :
- 2022
- Publisher :
- IEEE, 2022.
Abstract
- Image matching between optical and synthetic aperture radar (SAR) images is one of the most fundamental problems in earth observation. In recent years, many researchers have applied their domain expertise to design handcrafted descriptors for finding matches between optical and SAR images. However, the large nonlinear radiation difference between optical and SAR images makes image matching very difficult. To address these problems, this article proposes an efficient feature matching and position matching algorithm (MatchosNet) based on a local deep feature descriptor. First, a new dataset is presented by collecting a large number of corresponding SAR and optical images. Then, a deep convolutional network with dense blocks and cross stage partial networks is designed to generate deep feature descriptors. Next, the hard L2 loss function and the ARCpatch loss function are designed to improve the matching effect. In addition, building on feature matching, a two-dimensional (2-D) Gaussian function voting algorithm is designed to further match the positions of optical and SAR images of different sizes. Finally, extensive quantitative experiments show that MatchosNet achieves an excellent matching effect in both feature matching and position matching. The code will be released at: https://github.com/LiaoYun0x0/Feature-Matching-and-Position-Matching-between-Optical-and-SAR.
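The abstract's 2-D Gaussian voting idea can be illustrated with a minimal sketch: each feature match between a SAR patch and the larger optical image predicts an offset for the patch's position, and every prediction casts a Gaussian-weighted vote on a 2-D accumulator so that mutually consistent matches reinforce one another. This is an assumption-laden toy version (the function name, coordinate convention, and `sigma` parameter are hypothetical), not the paper's exact algorithm.

```python
import numpy as np

def gaussian_vote_position(matches, search_shape, sigma=5.0):
    """Estimate where a SAR patch lies inside a larger optical image
    by 2-D Gaussian voting (illustrative sketch, not the paper's code).

    matches: list of ((opt_x, opt_y), (sar_x, sar_y)) matched keypoints.
    search_shape: (H, W) of the vote map over the optical image.
    Returns the (row, col) of the highest-voted patch position.
    """
    H, W = search_shape
    vote_map = np.zeros((H, W))
    ys, xs = np.mgrid[0:H, 0:W]
    for (ox, oy), (sx, sy) in matches:
        # Each match predicts the patch's top-left corner in the
        # optical image; spread the vote with a 2-D Gaussian so that
        # nearby, consistent predictions accumulate into one peak.
        cx, cy = ox - sx, oy - sy
        vote_map += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2)
                           / (2.0 * sigma ** 2))
    return np.unravel_index(np.argmax(vote_map), vote_map.shape)
```

With this scheme, outlier matches scatter isolated low votes while correct matches pile up at the true offset, making the peak of the accumulator robust to a moderate fraction of mismatches.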
Details
- Language :
- English
- ISSN :
- 2151-1535
- Volume :
- 15
- Database :
- OpenAIRE
- Journal :
- IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
- Accession number :
- edsair.doi.dedup.....e37cd5d4f6556c521290aaa0b4e5fb72