IEEE Access (Jan 2021)

Distributed Supervised Discrete Hashing With Relaxation

  • Rui Hu,
  • Ming Ye,
  • Changyou Ma,
  • Feng Chen

DOI
https://doi.org/10.1109/ACCESS.2021.3074947
Journal volume & issue
Vol. 9
pp. 63729 – 63739

Abstract

Data-dependent hashing methods are increasingly attractive because they enable fast retrieval and compact storage of high-dimensional data. Most existing supervised hashing methods, such as supervised discrete hashing (SDH) and supervised discrete hashing with relaxation (SDHR), are centralized. SDH fixes the regression target to the conventional zero-one matrix encoding of class labels and fits it with ordinary least squares regression. SDHR instead learns the regression target matrix, constraining it so that each example is correctly classified with a large margin, thereby optimizing the regression target itself. In real environments, however, large amounts of data are distributed across different nodes, which severely limits centralized hashing methods. In this article, we propose a distributed supervised discrete hashing algorithm with relaxation (DSDHR) that extends SDHR to a distributed network. In this framework, all nodes share a common hash learning model, and consistency constraints are introduced so that the algorithm can be updated in parallel across multiple nodes. At each node, alternating iterative updates are used to obtain the binary hash codes, the regression target, and the hash function. Experiments show that DSDHR is competitive with both centralized hashing algorithms and distributed hashing algorithms.
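To make the alternating scheme described in the abstract concrete, the sketch below shows a generic supervised discrete hashing loop in NumPy. This is not the authors' DSDHR implementation: it is single-node, it uses the fixed zero-one label matrix of SDH rather than SDHR's margin-based learned regression target, and the binary-code step uses a simple sign relaxation in place of the discrete cyclic coordinate descent used in the SDH literature. All names and hyperparameters are illustrative.

```python
import numpy as np

def discrete_hashing_sketch(X, y, L=16, lam=1.0, nu=1e-2, iters=10, seed=0):
    """Illustrative alternating minimization for supervised discrete hashing.

    Minimizes (approximately):
        ||Y - B W||^2 + lam ||W||^2 + nu ||B - X P||^2,  B in {-1, +1}^{n x L}
    where Y is a zero-one label matrix, B the binary codes,
    W the label-regression matrix, and P a linear hash function.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)  # zero-one label matrix
    B = np.sign(rng.standard_normal((n, L)))            # random code init

    for _ in range(iters):
        # W-step: ridge regression of labels on the current binary codes
        W = np.linalg.solve(B.T @ B + lam * np.eye(L), B.T @ Y)
        # P-step: fit the linear hash function so that X P approximates B
        P = np.linalg.solve(X.T @ X + 1e-6 * np.eye(d), X.T @ B)
        # B-step (sign relaxation, not the paper's discrete coordinate descent)
        B = np.sign(Y @ W.T + nu * (X @ P))
        B[B == 0] = 1                                   # break ties to +1
    return B, P

# Unseen queries are hashed with the learned function: codes = sign(x @ P).
```

In the distributed setting of the paper, each node would run updates of this kind on its local data while consistency constraints keep the node models in agreement; that coordination step is omitted here.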

Keywords