
A Systematic Literature Review of Deep Learning Approaches for Sketch-Based Image Retrieval : Datasets, Metrics, and Future Directions

Authors :
Yang, Fan
Ismail, Nor Azman
Pang, Yee Yong
Kebande, Victor R.
Al-Dhaqm, Arafat
Koh, Tieng Wei
Publication Year :
2024

Abstract

Sketch-based image retrieval (SBIR) uses sketches to search for images containing similar objects or scenes. With the proliferation of touch-screen devices, sketching has become more accessible and has therefore received increasing attention. Deep learning has emerged as a potential tool for SBIR, allowing models to automatically extract image features and learn from large amounts of data. To the best of our knowledge, there is currently no systematic literature review (SLR) of SBIR with deep learning. Therefore, the aim of this review is to incorporate related works into a systematic study, highlighting the main contributions of individual researchers over the years, with a focus on past, present, and future trends. To this end, 90 studies published between 2016 and June 2023 were collected from four databases and analyzed using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. The specific models, datasets, evaluation metrics, and applications of deep learning in SBIR are discussed in detail. This study found that Convolutional Neural Networks (CNN) and Generative Adversarial Networks (GAN) are the most widely used deep learning methods for SBIR. A commonly used dataset is Sketchy, especially for the more recent zero-shot sketch-based image retrieval (ZS-SBIR) task. The results show that Mean Average Precision (mAP) is the most commonly used metric for quantitative evaluation of SBIR. Finally, we provide future directions and guidance for researchers based on the results of this review. © 2013 IEEE.
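For readers unfamiliar with the metric highlighted above, the following is a minimal sketch of how mean Average Precision (mAP) is typically computed for retrieval results. The function names and the toy relevance lists are illustrative assumptions, not drawn from the reviewed paper; in practice, gallery images are ranked by feature distance to each query sketch.

```python
# Minimal sketch of mean Average Precision (mAP) for retrieval evaluation.
# The relevance lists below are toy data; real SBIR evaluations rank the
# image gallery by embedding distance to each query sketch.

def average_precision(relevance):
    """relevance: 0/1 flags for the ranked gallery, best-scoring item first."""
    hits, precision_sum = 0, 0.0
    for rank, rel in enumerate(relevance, start=1):
        if rel:
            hits += 1
            precision_sum += hits / rank  # precision@rank at each relevant hit
    return precision_sum / hits if hits else 0.0

def mean_average_precision(all_relevance):
    """all_relevance: one relevance list per query sketch."""
    return sum(average_precision(r) for r in all_relevance) / len(all_relevance)

# Toy example: two query sketches with hypothetical ranked relevance flags.
print(mean_average_precision([[1, 0, 1, 0], [0, 1, 1, 0]]))  # ≈ 0.708
```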

Details

Database :
OAIster
Notes :
application/pdf, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1442716335
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1109/ACCESS.2024.3357939