
Long-tail Cross Modal Hashing

Authors:
Gao, Zijun
Wang, Jun
Yu, Guoxian
Yan, Zhongmin
Domeniconi, Carlotta
Zhang, Jinglin
Publication Year:
2022

Abstract

Existing Cross Modal Hashing (CMH) methods are mainly designed for balanced data, whereas imbalanced data with a long-tail distribution is far more common in the real world. Several long-tail hashing methods have been proposed, but they cannot adapt to multi-modal data because of the complex interplay between labels and the individuality and commonality information of multi-modal data. Furthermore, CMH methods mostly mine the commonality of multi-modal data to learn hash codes, which may override tail labels encoded by the individuality of the respective modalities. In this paper, we propose LtCMH (Long-tail CMH) to handle imbalanced multi-modal data. LtCMH first adopts auto-encoders to mine the individuality and commonality of different modalities by minimizing the dependency between the individuality of the respective modalities and by enhancing their commonality. It then dynamically combines the individuality and commonality with direct features extracted from the respective modalities to create meta features that enrich the representation of tail labels, and binarizes the meta features to generate hash codes. LtCMH significantly outperforms state-of-the-art baselines on long-tail datasets and achieves better (or comparable) performance on datasets with balanced labels.

Comment: Accepted by the Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI 2023)
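
The abstract describes a concrete pipeline: per-modality auto-encoders yield individuality representations, a shared space captures commonality, both are fused with the direct features into meta features, and the meta features are binarized into hash codes. The following is a minimal PyTorch sketch of that pipeline for an image/text pair. The class names, layer sizes, fusion by concatenation, and the cosine-based penalty used as a stand-in for "minimizing the dependency between individualities" are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only -- not the authors' released code. Layer sizes,
# the fusion scheme, and the loss terms below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityAutoEncoder(nn.Module):
    """Per-modality auto-encoder whose bottleneck serves as that
    modality's individuality representation."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.decoder = nn.Linear(hid_dim, in_dim)

    def forward(self, x):
        z = self.encoder(x)           # individuality embedding
        return z, self.decoder(z)     # embedding and reconstruction

class LtCMHSketch(nn.Module):
    """Toy version of the pipeline in the abstract: individuality +
    commonality + direct features -> meta features -> hash codes."""
    def __init__(self, img_dim, txt_dim, hid_dim=512, code_len=32):
        super().__init__()
        self.ae_img = ModalityAutoEncoder(img_dim, hid_dim)
        self.ae_txt = ModalityAutoEncoder(txt_dim, hid_dim)
        # projections into a common space (one per modality view)
        self.common_img = nn.Linear(img_dim, hid_dim)
        self.common_txt = nn.Linear(txt_dim, hid_dim)
        # hash layers over the concatenated meta features
        self.hash_img = nn.Linear(img_dim + 2 * hid_dim, code_len)
        self.hash_txt = nn.Linear(txt_dim + 2 * hid_dim, code_len)

    def forward(self, x_img, x_txt):
        ind_i, rec_i = self.ae_img(x_img)           # individuality (image)
        ind_t, rec_t = self.ae_txt(x_txt)           # individuality (text)
        com_i = torch.tanh(self.common_img(x_img))  # commonality (image view)
        com_t = torch.tanh(self.common_txt(x_txt))  # commonality (text view)

        # meta features: direct features enriched with individuality/commonality
        meta_i = torch.cat([x_img, ind_i, com_i], dim=1)
        meta_t = torch.cat([x_txt, ind_t, com_t], dim=1)

        # relaxed codes during training; sign() gives the binary codes
        h_i = torch.tanh(self.hash_img(meta_i))
        h_t = torch.tanh(self.hash_txt(meta_t))

        losses = {
            "recon": F.mse_loss(rec_i, x_img) + F.mse_loss(rec_t, x_txt),
            # pull the two commonality views together
            "common": F.mse_loss(com_i, com_t),
            # crude proxy for "minimize dependency between individualities":
            # penalize their per-sample cosine similarity
            "indep": (F.normalize(ind_i, dim=1) *
                      F.normalize(ind_t, dim=1)).sum(dim=1).abs().mean(),
        }
        return torch.sign(h_i), torch.sign(h_t), h_i, h_t, losses

In practice, the dependency penalty and the commonality-alignment term would be replaced by whatever measures the paper actually uses; the sketch only indicates where such terms enter the forward pass and how the meta features feed the hashing layer.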

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2211.15162
Document Type:
Working Paper