
Deep parameter-free attention hashing for image retrieval.

Authors :
Yang, Wenjing
Wang, Liejun
Cheng, Shuli
Source :
Scientific Reports. 4/30/2022, Vol. 12 Issue 1, p1-20. 20p.
Publication Year :
2022

Abstract

Deep hashing methods are widely applied in image retrieval because of their low storage consumption and fast retrieval speed. However, existing deep hashing methods that use a convolutional neural network (CNN) to extract semantic features from images suffer from insufficient feature extraction. Some studies propose adding channel-based or spatial-based attention modules, but embedding these modules into the network increases model complexity and can lead to overfitting during training. In this study, a novel deep parameter-free attention hashing (DPFAH) method is proposed to solve these problems; it introduces a parameter-free attention (PFA) module into a ResNet18 network. PFA is a lightweight module that defines an energy function to measure the importance of each neuron and infers 3-D attention weights for the feature map of a layer. A fast closed-form solution to this energy function shows that the PFA module adds no parameters to the network. In addition, this paper designs a novel hashing framework that includes a hash-code learning branch and a classification branch to exploit more label information. The like-binary codes are constrained by a regularization term to reduce the quantization error of the continuous relaxation. Experiments on CIFAR-10, NUS-WIDE and ImageNet-100 show that the DPFAH method achieves better performance. [ABSTRACT FROM AUTHOR]
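The abstract names two technical ingredients: a closed-form, energy-based attention that adds no learnable parameters, and a regularization term that pushes the like-binary codes toward binary values. The sketch below is a minimal PyTorch illustration of how such components might look; the energy formula follows the widely used SimAM-style closed form, and the names PFAModule, lam and quantization_loss are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class PFAModule(nn.Module):
    # Illustrative parameter-free attention: a closed-form, energy-based
    # 3-D weighting of a feature map (SimAM-style). The exact energy
    # function used by DPFAH may differ.
    def __init__(self, lam: float = 1e-4):
        super().__init__()
        self.lam = lam  # regularization constant in the energy function (assumed)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) feature map
        b, c, h, w = x.shape
        n = h * w - 1
        # squared deviation of each neuron from its channel mean
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
        # channel-wise variance estimate
        v = d.sum(dim=(2, 3), keepdim=True) / n
        # closed-form inverse energy; more distinctive neurons get larger weights
        e_inv = d / (4 * (v + self.lam)) + 0.5
        # 3-D attention weights applied to the feature map; no learnable parameters
        return x * torch.sigmoid(e_inv)

def quantization_loss(u: torch.Tensor) -> torch.Tensor:
    # A common form of the regularization term that pushes like-binary codes u
    # toward {-1, +1}, reducing quantization error in the continuous relaxation
    # (assumed form; the paper's exact term may differ).
    return ((u.abs() - 1.0) ** 2).mean()

# Usage sketch: attach PFA to a ResNet18 stage output and regularize hash codes.
feat = torch.randn(8, 256, 14, 14)      # hypothetical ResNet18 feature map
codes = torch.tanh(torch.randn(8, 48))  # hypothetical 48-bit like-binary codes
attended = PFAModule()(feat)
loss_q = quantization_loss(codes)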

Details

Language :
English
ISSN :
20452322
Volume :
12
Issue :
1
Database :
Academic Search Index
Journal :
Scientific Reports
Publication Type :
Academic Journal
Accession number :
156619797
Full Text :
https://doi.org/10.1038/s41598-022-11217-5