
Cross-modal hashing with missing labels.

Authors :
Ni, Haomin
Zhang, Jianjun
Kang, Peipei
Fang, Xiaozhao
Sun, Weijun
Xie, Shengli
Han, Na
Source :
Neural Networks. Aug2023, Vol. 165, p60-76. 17p.
Publication Year :
2023

Abstract

Hashing-based cross-modal retrieval methods have become increasingly popular due to their advantages in storage and speed. While current methods have demonstrated impressive results, several issues remain unaddressed. Specifically, many of these approaches assume that labels are perfectly assigned, despite the fact that in real-world scenarios labels are often incomplete or partially missing. There are two reasons for this: manual labeling is a complex and time-consuming task, and annotators may only be interested in certain objects. As such, cross-modal retrieval with missing labels is a significant challenge that requires further attention. Moreover, the similarity between labels, which is important for exploring their high-level semantics, is frequently ignored. To address these limitations, we propose a novel method called Cross-Modal Hashing with Missing Labels (CMHML). Our method consists of several key components. First, we introduce Reliable Label Learning to preserve reliable information from the observed labels. Next, to infer the uncertain part of the predicted labels, we decompose the predicted labels into latent representations of labels and samples. The representation of samples is extracted from different modalities, which assists in inferring missing labels. We also propose Label Correlation Preservation to enhance the similarity between latent representations of labels. Hash codes are then learned from the representation of samples through Global Approximation Learning. We also construct a similarity matrix from the predicted labels and embed it into hash code learning to exploit the value of labels. Finally, we train linear classifiers to map original samples to a low-dimensional Hamming space. To evaluate the efficacy of CMHML, we conduct extensive experiments on four publicly available datasets. Our method is compared with other state-of-the-art methods, and the results demonstrate that our model performs competitively even when most labels are missing. [ABSTRACT FROM AUTHOR]
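To make the abstract's pipeline concrete, the sketch below mimics its main stages with a toy NumPy implementation: completing an incomplete label matrix by factorizing it into latent sample and label representations, deriving binary hash codes from the sample representation, and fitting per-modality linear classifiers that map raw features into a low-dimensional Hamming space. All matrix shapes, regularization terms, and update rules here are illustrative assumptions; the paper's actual objectives (Reliable Label Learning, Label Correlation Preservation, Global Approximation Learning) are not reproduced.

```python
# Minimal, hypothetical sketch of the pipeline described in the abstract.
# Dimensions, loss terms, and the alternating updates are illustrative
# assumptions, not the authors' exact CMHML formulation.
import numpy as np

rng = np.random.default_rng(0)

n, d1, d2, c, r = 500, 64, 32, 10, 16      # samples, two modality dims, classes, hash bits
X1 = rng.standard_normal((n, d1))           # modality-1 features (e.g., image)
X2 = rng.standard_normal((n, d2))           # modality-2 features (e.g., text)
Y_obs = (rng.random((n, c)) < 0.1).astype(float)  # observed (incomplete) labels
M = (rng.random((n, c)) < 0.7).astype(float)      # 1 = label observed, 0 = missing

# 1) Infer missing labels by factorizing the label matrix into a shared
#    sample representation V and a latent label representation U.
#    Missing entries are simply zeroed here, a crude approximation.
k, lam = 20, 1e-2
V = rng.standard_normal((n, k)) * 0.01
U = rng.standard_normal((k, c)) * 0.01
for _ in range(50):
    # ridge-regularized alternating least-squares updates
    U = np.linalg.solve(V.T @ V + lam * np.eye(k), V.T @ (M * Y_obs))
    V = np.linalg.solve(U @ U.T + lam * np.eye(k), U @ (M * Y_obs).T).T
Y_pred = V @ U                               # completed (predicted) labels

# 2) Derive hash codes from the sample representation; the sign of a random
#    projection is a stand-in for the paper's hash-code learning step.
P = rng.standard_normal((k, r))
B = np.sign(V @ P)
B[B == 0] = 1

# 3) Train per-modality linear classifiers that map raw features to the
#    codes, so unseen queries from either modality can be hashed directly.
W1 = np.linalg.solve(X1.T @ X1 + lam * np.eye(d1), X1.T @ B)
W2 = np.linalg.solve(X2.T @ X2 + lam * np.eye(d2), X2.T @ B)

query_codes = np.sign(X1[:5] @ W1)           # hash a few modality-1 queries
print(query_codes.shape)                     # (5, r)
```

In such schemes, retrieval then reduces to comparing Hamming distances between query codes and database codes, which is where the storage and speed advantages mentioned in the abstract come from.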

Details

Language :
English
ISSN :
0893-6080
Volume :
165
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
169815599
Full Text :
https://doi.org/10.1016/j.neunet.2023.05.035