Jisuanji Kexue (Oct 2022)

Robust Hash Learning Method Based on Dual-teacher Self-supervised Distillation

  • MIAO Zhuang, WANG Ya-peng, LI Yang, WANG Jia-bao, ZHANG Rui, ZHAO Xin-xin

DOI
https://doi.org/10.11896/jsjkx.210800050
Journal volume & issue
Vol. 49, no. 10
pp. 159 – 168

Abstract


To improve the performance of unsupervised hash learning and achieve robust hash-based image retrieval, this paper proposes a novel robust hash learning method based on dual-teacher self-supervised distillation. The proposed method consists of two stages: a self-supervised dual-teacher learning stage and a robust hash learning stage. In the first stage, a modified clustering algorithm is designed to effectively improve the accuracy of hard pseudo labels, and the teacher networks are fine-tuned on these hard pseudo labels to obtain initial soft pseudo labels. In the second stage, the initial soft pseudo labels are filtered by a soft pseudo label denoising method that combines a hybrid denoising strategy with a dual-teacher denoising strategy. The student network is then trained on the denoised soft pseudo labels by knowledge distillation, yielding robust hash codes for label-free images. Extensive experiments on the CIFAR-10, FLICKR25K and EuroSAT datasets show that the proposed method outperforms state-of-the-art methods: its MAP is 18.6% higher than that of TBH on CIFAR-10, 2.4% higher than that of DistillHash on FLICKR25K, and 18.5% higher than that of ETE-GAN on EuroSAT.
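To make the second stage concrete, the following is a minimal Python (PyTorch) sketch of distilling denoised soft pseudo labels from two teacher networks into a student hash network. It is not the authors' implementation: the network shapes, the agreement-and-confidence denoising rule, the quantization term, and all hyperparameters are illustrative assumptions standing in for the paper's hybrid and dual-teacher denoising strategies.

# Illustrative sketch only; all names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HashNet(nn.Module):
    """Backbone-agnostic head mapping features to K-bit hash codes and class logits."""

    def __init__(self, feat_dim: int = 512, n_bits: int = 64, n_classes: int = 10):
        super().__init__()
        self.hash_layer = nn.Linear(feat_dim, n_bits)    # continuous codes in (-1, 1)
        self.classifier = nn.Linear(n_bits, n_classes)   # soft pseudo-label head

    def forward(self, feats: torch.Tensor):
        codes = torch.tanh(self.hash_layer(feats))
        logits = self.classifier(codes)
        return codes, logits


def denoise_soft_labels(p1: torch.Tensor, p2: torch.Tensor, tau: float = 0.8):
    """Toy stand-in for dual-teacher denoising: keep a sample only if both
    teachers agree on the arg-max class and are confident enough."""
    agree = p1.argmax(dim=1) == p2.argmax(dim=1)
    confident = torch.minimum(p1.max(dim=1).values, p2.max(dim=1).values) > tau
    mask = agree & confident
    soft_labels = 0.5 * (p1 + p2)          # average the two teachers' predictions
    return soft_labels, mask


def distill_step(student, teacher1, teacher2, feats, optimizer, T: float = 4.0):
    """One knowledge-distillation step on a batch of image features."""
    with torch.no_grad():
        _, t1_logits = teacher1(feats)
        _, t2_logits = teacher2(feats)
        p1 = F.softmax(t1_logits / T, dim=1)
        p2 = F.softmax(t2_logits / T, dim=1)
    soft_labels, mask = denoise_soft_labels(p1, p2)
    if mask.sum() == 0:
        return 0.0                          # nothing passed the denoising filter

    codes, s_logits = student(feats)
    log_p_s = F.log_softmax(s_logits[mask] / T, dim=1)
    kd_loss = F.kl_div(log_p_s, soft_labels[mask], reduction="batchmean") * T * T
    quant_loss = (codes[mask].abs() - 1.0).pow(2).mean()   # push codes toward ±1

    loss = kd_loss + 0.1 * quant_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    student, teacher1, teacher2 = HashNet(), HashNet(), HashNet()
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    feats = torch.randn(32, 512)            # pretend backbone features for one batch
    print(distill_step(student, teacher1, teacher2, feats, opt))
    # At retrieval time, binary hash codes would be taken as sign(codes).

With untrained teachers the confidence filter rejects most samples, so the demo typically prints 0.0; in practice the teachers would first be fine-tuned on the hard pseudo labels produced by the clustering stage described above.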

Keywords