
Geometry Sensitive Cross-Modal Reasoning for Composed Query Based Image Retrieval

Authors:
Feifei Zhang
Mingliang Xu
Changsheng Xu
Source:
IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society, vol. 31
Publication Year:
2022

Abstract

Composed Query Based Image Retrieval (CQBIR) aims to retrieve images relevant to a composed query, which consists of a reference image and a requested modification expressed as a textual sentence. Compared with conventional image retrieval, which takes one modality as the query to retrieve relevant data of another modality, CQBIR poses a great challenge due to the semantic gap between the reference image and the modification text in the composed query. To address this challenge, previous methods either resort to feature composition, which cannot model interactions within the query, or explore inter-modal attention while ignoring the spatial structure and the visual-semantic relationship. In this paper, we propose a geometry sensitive cross-modal reasoning network for CQBIR that jointly models the geometric information of the image and the visual-semantic relationship between the reference image and the modification text in the query. Specifically, it contains two key components: a geometry sensitive inter-modal attention module (GS-IMA) and a text-guided visual reasoning module (TG-VR). The GS-IMA introduces spatial structure into the inter-modal attention in both implicit and explicit manners. The TG-VR models the unequal semantics not included in the reference image to guide further visual reasoning. As a result, our method can learn effective features for composed queries that do not exhibit literal alignment. Comprehensive experimental results on three standard benchmarks demonstrate that the proposed model performs favorably against state-of-the-art methods.
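The abstract describes, at a high level, injecting spatial structure into inter-modal attention. As a rough illustration only, the following PyTorch-style sketch shows one plausible way such geometry-sensitive cross-attention could be realized: region bounding boxes are folded into the region features (an "implicit" use of geometry) and also turned into an additive attention bias (an "explicit" use). The module, its parameter names, and the box-based bias are assumptions for illustration, not the authors' GS-IMA implementation.

# Illustrative sketch only, NOT the paper's released code. Assumes region
# features come with normalized bounding boxes (e.g. from an object detector);
# all names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeometrySensitiveCrossAttention(nn.Module):
    def __init__(self, dim: int = 512):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)   # queries from text tokens
        self.k_proj = nn.Linear(dim, dim)   # keys from image regions
        self.v_proj = nn.Linear(dim, dim)   # values from image regions
        # "Implicit" geometry: fold box coordinates into the region features.
        self.box_embed = nn.Linear(4, dim)
        # "Explicit" geometry: a scalar attention bias derived from each box.
        self.geo_bias = nn.Linear(4, 1)
        self.scale = dim ** -0.5

    def forward(self, text_feats, region_feats, boxes):
        # text_feats:   (B, T, dim)  token features of the modification text
        # region_feats: (B, R, dim)  visual features of detected regions
        # boxes:        (B, R, 4)    normalized (x1, y1, x2, y2) per region
        regions = region_feats + self.box_embed(boxes)              # implicit
        q = self.q_proj(text_feats)
        k = self.k_proj(regions)
        v = self.v_proj(regions)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale  # (B, T, R)
        scores = scores + self.geo_bias(boxes).transpose(-2, -1)    # explicit
        attn = F.softmax(scores, dim=-1)    # attend over regions per text token
        return torch.matmul(attn, v)        # text-conditioned visual features

In a full CQBIR pipeline, the output of such a module would presumably be fused with the text representation and matched against gallery image embeddings; those stages are omitted here.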

Details

ISSN:
1941-0042
Volume:
31
Database:
OpenAIRE
Journal:
IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society
Accession number:
edsair.doi.dedup.....695701de19abe62857cefd837d804e63