
Cross-Modal Hierarchical Modelling for Fine-Grained Sketch Based Image Retrieval

Authors:
Sain, Aneeshan
Bhunia, Ayan Kumar
Yang, Yongxin
Xiang, Tao
Song, Yi-Zhe
Publication Year:
2020
Publisher:
arXiv, 2020.

Abstract

Sketch as an image search query is an ideal alternative to text in capturing fine-grained visual details. Prior successes on fine-grained sketch-based image retrieval (FG-SBIR) have demonstrated the importance of tackling the unique traits of sketches as opposed to photos, e.g., temporal vs. static, strokes vs. pixels, and abstract vs. pixel-perfect. In this paper, we study a further trait of sketches that has been overlooked to date: they are hierarchical in terms of levels of detail, i.e., a person typically sketches up to varying extents of detail to depict an object. This hierarchical structure is often visually distinct. We design a novel network that is capable of cultivating sketch-specific hierarchies and exploiting them to match sketch with photo at corresponding hierarchical levels. In particular, features from a sketch and a photo are enriched using cross-modal co-attention, coupled with hierarchical node fusion at every level, to form a better embedding space in which to conduct retrieval. Experiments on common benchmarks show our method outperforms state-of-the-art approaches by a significant margin.

Comment: Accepted for oral presentation at BMVC 2020.
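To make the co-attention step concrete, the following is a minimal, illustrative sketch of cross-modal co-attention between sketch and photo feature sequences. The module name, linear projections, tensor shapes, and residual fusion are assumptions for illustration only, and the per-level hierarchical node fusion described in the abstract is omitted; this is not the authors' implementation.

# Illustrative cross-modal co-attention (hypothetical names and shapes).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalCoAttention(nn.Module):
    """Enrich sketch features with photo context and vice versa via a
    shared affinity matrix (a generic co-attention formulation)."""
    def __init__(self, dim: int):
        super().__init__()
        self.proj_s = nn.Linear(dim, dim)  # projects sketch features
        self.proj_p = nn.Linear(dim, dim)  # projects photo features

    def forward(self, sketch_feats, photo_feats):
        # sketch_feats: (B, Ns, D) region/stroke-level sketch features
        # photo_feats:  (B, Np, D) region-level photo features
        q_s = self.proj_s(sketch_feats)                  # (B, Ns, D)
        q_p = self.proj_p(photo_feats)                   # (B, Np, D)
        affinity = torch.bmm(q_s, q_p.transpose(1, 2))   # (B, Ns, Np)
        # Attend over the other modality in each direction.
        sketch_ctx = torch.bmm(F.softmax(affinity, dim=2), photo_feats)  # (B, Ns, D)
        photo_ctx = torch.bmm(F.softmax(affinity, dim=1).transpose(1, 2),
                              sketch_feats)                              # (B, Np, D)
        # Residual enrichment; the paper additionally fuses hierarchy
        # nodes at every level, which is not modelled here.
        return sketch_feats + sketch_ctx, photo_feats + photo_ctx

# Example usage with random features standing in for backbone outputs:
# coatt = CrossModalCoAttention(dim=512)
# s, p = coatt(torch.randn(4, 16, 512), torch.randn(4, 49, 512))

The enriched sketch and photo embeddings would then be compared (e.g., by a distance in the joint embedding space) to rank candidate photos for retrieval.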

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....49fafb4aad2fcb71a8040c1f6157037e
Full Text:
https://doi.org/10.48550/arxiv.2007.15103