Journal of King Saud University: Computer and Information Sciences (Jan 2023)

Differentially private block coordinate descent

  • Shazia Riaz,
  • Saqib Ali,
  • Guojun Wang,
  • Asad Anees

Journal volume & issue
Vol. 35, no. 1
pp. 283 – 295

Abstract

Deep learning models have revolutionized AI tasks by producing accurate predictions. Their success largely depends on precise training with large-scale datasets, mostly crowdsourced from the target population. These training datasets may contain sensitive personal information, and the model parameters can encode this information in the internal weights of hidden layers, creating a risk of privacy breach. The modern trend of sharing trained models has multiplied this risk. The performance of existing privacy-preserving deep learning models that try to address this issue is unsatisfactory; consequently, only a marginal proportion of these privacy-preserving models have been adopted in industry. Therefore, we developed the first differentially private version of the block coordinate descent (BCD) algorithm. Our proposed mechanism considerably reduces the privacy cost by injecting an appropriate amount of noise into the block variables. It achieves accuracy comparable to its non-private counterparts, improves convergence speed, and provides a provable privacy guarantee by performing privacy accounting with the advanced composition and moments accountant methods. We empirically evaluate the robustness of the proposed mechanism on benchmark datasets. The results demonstrate competitive performance, in terms of both privacy-cost reduction and fast convergence, against state-of-the-art differential-privacy-based mechanisms.
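To make the general idea concrete, the sketch below shows a generic block coordinate descent loop in which each block update is clipped and perturbed with Gaussian noise before being applied. This is only an illustrative sketch of the noise-per-block-update pattern, not the authors' actual algorithm or privacy analysis: the least-squares objective, the block partition, and the `clip_norm`, `sigma`, and learning-rate values are placeholder assumptions introduced here for demonstration.

```python
import numpy as np

# Illustrative sketch only: block coordinate descent on a least-squares
# objective, with each block gradient clipped and perturbed by Gaussian
# noise. The objective, block partition, and noise/clipping parameters
# are placeholder choices, not taken from the paper.

rng = np.random.default_rng(0)

def dp_block_coordinate_descent(X, y, n_blocks=4, iters=50,
                                lr=0.1, clip_norm=1.0, sigma=1.0):
    n_features = X.shape[1]
    w = np.zeros(n_features)
    # Partition the parameter vector into contiguous blocks.
    blocks = np.array_split(np.arange(n_features), n_blocks)

    for _ in range(iters):
        for block in blocks:
            # Gradient of 0.5 * ||Xw - y||^2 / n restricted to this block.
            residual = X @ w - y
            grad = X[:, block].T @ residual / len(y)

            # Clip the block gradient to bound its norm (sensitivity proxy).
            norm = np.linalg.norm(grad)
            grad = grad / max(1.0, norm / clip_norm)

            # Perturb the clipped block update with Gaussian noise.
            noise = rng.normal(0.0, sigma * clip_norm, size=grad.shape)
            w[block] -= lr * (grad + noise)
    return w

# Toy usage: recover weights from synthetic linear data.
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.01 * rng.normal(size=200)
w_hat = dp_block_coordinate_descent(X, y)
print(np.round(w_hat, 2))
```

In such schemes, the accumulated privacy cost of the repeated noisy block updates would be tracked with a composition tool such as the moments accountant, as the abstract indicates; that accounting step is omitted from this sketch.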

Keywords