Applied Sciences (Dec 2023)

Multi-Label Weighted Contrastive Cross-Modal Hashing

  • Zeqian Yi,
  • Xinghui Zhu,
  • Runbing Wu,
  • Zhuoyang Zou,
  • Yi Liu,
  • Lei Zhu

DOI
https://doi.org/10.3390/app14010093
Journal volume & issue
Vol. 14, no. 1
p. 93

Abstract


Due to the low storage cost and high computational efficiency of hashing, cross-modal hashing has attracted widespread attention in recent years. In this paper, we investigate how supervised cross-modal hashing (CMH) benefits from multi-label information and contrastive learning (CL) by overcoming two challenges: (i) how to combine multi-label supervision with contrastive learning so as to capture the diverse relationships among cross-modal instances, and (ii) how to reduce the sparsity of multi-label representations so as to improve the accuracy of similarity measurement. To this end, we propose a novel cross-modal hashing framework, dubbed Multi-Label Weighted Contrastive Hashing (MLWCH). The framework introduces a compact consistent similarity representation, a newly designed multi-label similarity calculation method that efficiently reduces the sparsity of multi-label representations by eliminating redundant zero elements. Furthermore, a novel multi-label weighted contrastive learning strategy is developed that significantly improves hash learning by assigning a similarity weight to each positive sample under both linear and non-linear similarity measures. Extensive experiments and ablation analysis on three benchmark datasets validate the superiority of our MLWCH method over several strong baselines.
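To make the weighting idea concrete, the following is a minimal NumPy sketch of one plausible form of a multi-label weighted contrastive loss. It is not the paper's implementation: the cosine-style label similarity used as the positive-pair weight, the temperature value, and all function names are illustrative assumptions.

```python
import numpy as np

def label_similarity(labels):
    """Cosine-style similarity between multi-hot label vectors.

    labels: (n, c) multi-hot matrix. Pairs sharing more labels get a
    larger weight; disjoint pairs get weight 0. (Assumed weighting
    scheme, not the paper's exact formulation.)
    """
    inter = labels @ labels.T
    norms = np.sqrt((labels ** 2).sum(axis=1, keepdims=True))
    return inter / (norms @ norms.T + 1e-8)

def weighted_contrastive_loss(img_feat, txt_feat, labels, tau=0.5):
    """InfoNCE-style cross-modal loss with label-similarity weights.

    Each image is contrasted against all texts; every text sharing at
    least one label counts as a positive, and its log-probability is
    weighted by the label similarity of the pair.
    """
    img = img_feat / np.linalg.norm(img_feat, axis=1, keepdims=True)
    txt = txt_feat / np.linalg.norm(txt_feat, axis=1, keepdims=True)
    logits = img @ txt.T / tau                     # (n, n) scaled similarities
    sim = label_similarity(labels)                 # weights in [0, 1]
    pos_mask = (sim > 0).astype(float)             # positives share >= 1 label
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Weighted average of positive log-probabilities per anchor
    weights = sim * pos_mask
    weighted = (weights * log_prob).sum(axis=1) / np.maximum(weights.sum(axis=1), 1e-8)
    return -weighted.mean()
```

Because the diagonal of the label-similarity matrix is always 1 (each instance matches its own labels), every anchor has at least one positive and the normalizing denominator never vanishes.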

Keywords