IEEE Access (Jan 2024)
Deep Feature-Based Neighbor Similarity Hashing With Adversarial Learning for Cross-Modal Retrieval
Abstract
Deep hashing methods for cross-modal retrieval have recently achieved impressive performance. However, label-based pairwise semantic similarity is confined to the scope of tags and overlooks the intrinsic connections within the content itself. To address this problem, we propose a novel deep hashing framework, named Deep Feature-based Neighbor Similarity Hashing with adversarial learning (DFNSH), to associate fine-grained semantic relations and map high-level semantic similarity into binary codes. Specifically, to guarantee semantic consistency beyond labels, a feature-based neighbor similarity matrix is constructed from data feature vectors, independent of labels. Moreover, for feature extraction, two Contrastive Language-Image Pre-training (CLIP) networks are employed as the backbone to obtain more representative features. Furthermore, an adversarial training strategy sufficiently extracts intra-modal information and autonomously explores inter-modal heterogeneous correlations. Extensive experiments on three public benchmark datasets demonstrate that DFNSH achieves promising performance with respect to different evaluation metrics. The code can be downloaded at https://github.com/Lisa-Likun/DFNSH.
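To make the label-free construction concrete, the sketch below shows one plausible way to build a feature-based neighbor similarity matrix from modality embeddings. It is only an illustration: the features are assumed to come from the CLIP image and text encoders, and the neighbor count `k` and the "neighbor in either modality" fusion rule are hypothetical choices, not the exact formulation used by DFNSH.

```python
import torch
import torch.nn.functional as F

def neighbor_similarity_matrix(image_feats: torch.Tensor,
                               text_feats: torch.Tensor,
                               k: int = 10) -> torch.Tensor:
    """Sketch of a label-free, feature-based neighbor similarity matrix.

    image_feats / text_feats are assumed to be (N, d) embeddings from the
    CLIP image and text encoders; k is an illustrative hyper-parameter.
    """
    img = F.normalize(image_feats, dim=1)
    txt = F.normalize(text_feats, dim=1)

    # Cosine similarity within each modality.
    sim_img = img @ img.t()          # (N, N)
    sim_txt = txt @ txt.t()          # (N, N)

    # Mark the k nearest neighbors of each sample in a modality.
    def knn_mask(sim: torch.Tensor) -> torch.Tensor:
        idx = sim.topk(k + 1, dim=1).indices      # +1 accounts for self-match
        return torch.zeros_like(sim).scatter_(1, idx, 1.0)

    # Treat two samples as semantic neighbors if they are close in either
    # modality's feature space (one possible fusion rule, no labels used).
    return torch.clamp(knn_mask(sim_img) + knn_mask(sim_txt), max=1.0)

# Example with random stand-ins for CLIP features of 8 image-text pairs.
S = neighbor_similarity_matrix(torch.randn(8, 512), torch.randn(8, 512), k=3)
print(S.shape)  # torch.Size([8, 8])
```

The resulting binary matrix S can then supervise hash-code learning in place of label-based pairwise similarity.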
Keywords