
JRA-Net: Joint representation attention network for correspondence learning.

Authors :
Shi, Ziwei
Xiao, Guobao
Zheng, Linxin
Ma, Jiayi
Chen, Riqing
Source :
Pattern Recognition. Mar 2023, Vol. 135.
Publication Year :
2023

Abstract

• We design a three-layer deep learning framework for outlier rejection.
• We propose a novel joint representation attention mechanism.
• We design an innovative weight function to improve the generalization ability.
• Experimental results show the proposed network is superior to state-of-the-art networks.

In this paper, we propose a Joint Representation Attention Network (JRA-Net), an end-to-end network for establishing reliable correspondences between image pairs. The initial correspondences generated by local feature descriptors usually suffer from heavy outliers, which prevents the network from learning a representation powerful enough to distinguish inliers from outliers. To this end, we design a novel attention mechanism. The proposed attention mechanism not only takes into account the correlations between global context and geometric information, but also introduces a joint representation at different scales to suppress trivial correspondences and highlight crucial ones. In addition, to improve the generalization ability of the attention mechanism, we present an innovative weight function that effectively adjusts the importance of the attention mechanism in a learnable manner. Finally, by combining the above components, JRA-Net is able to effectively infer the probability of each correspondence being an inlier. Empirical experiments on challenging datasets demonstrate the effectiveness and generalization of JRA-Net, with remarkable improvements over current state-of-the-art approaches on outlier rejection and relative pose estimation. [ABSTRACT FROM AUTHOR]
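The record above gives only a high-level description of the architecture. As a rough illustration of what an attention block combining a global-context branch with per-correspondence (local) features, gated by a learnable importance weight, might look like, here is a minimal PyTorch sketch. All module names, shapes, and the scalar gate are assumptions for illustration; the paper's actual JRA-Net layers are not specified in this record.

```python
# Illustrative sketch only -- NOT the published JRA-Net implementation.
# Assumes the common correspondence-learning setup where each of N putative
# matches carries a C-dimensional feature, laid out as a (B, C, N) tensor.
import torch
import torch.nn as nn


class JointAttentionSketch(nn.Module):
    """Combines a local branch with a pooled global-context branch into a
    joint representation, scores each correspondence, and gates the attended
    features with a learnable scalar weight (a hedged stand-in for the
    paper's weight function)."""

    def __init__(self, channels: int = 128):
        super().__init__()
        self.local = nn.Conv1d(channels, channels, kernel_size=1)
        self.context = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),             # global context over all N
            nn.Conv1d(channels, channels, 1),
            nn.ReLU(inplace=True),
        )
        self.score = nn.Conv1d(channels, 1, kernel_size=1)
        self.gate = nn.Parameter(torch.zeros(1))  # learnable importance weight

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, N) -- one feature vector per putative correspondence
        joint = self.local(feats) + self.context(feats)    # joint representation
        attn = torch.sigmoid(self.score(joint))            # (B, 1, N) in [0, 1]
        # Residual gating: suppress trivial and highlight crucial correspondences
        return feats + self.gate * (attn * feats)


if __name__ == "__main__":
    x = torch.randn(2, 128, 1000)      # batch of 2 pairs, 1000 correspondences
    out = JointAttentionSketch()(x)
    print(out.shape)                   # torch.Size([2, 128, 1000])
```

Initializing the gate at zero makes the block an identity mapping at the start of training, so the attention path only contributes once the network learns a useful weight; this is one plausible reading of "adjusting the importance of the attention mechanism in a learnable manner," not a claim about the authors' design.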

Details

Language :
English
ISSN :
0031-3203
Volume :
135
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
160538780
Full Text :
https://doi.org/10.1016/j.patcog.2022.109180