Applied Sciences (Jan 2025)

Privacy Auditing in Differential Private Machine Learning: The Current Trends

  • Ivars Namatevs,
  • Kaspars Sudars,
  • Arturs Nikulins,
  • Kaspars Ozols

DOI
https://doi.org/10.3390/app15020647
Journal volume & issue
Vol. 15, no. 2
p. 647

Abstract

Differential privacy has recently gained prominence, especially in the context of private machine learning. While the definition of differential privacy makes it possible to provably limit the amount of information leaked by an algorithm, practical implementations of differentially private algorithms often contain subtle vulnerabilities. There is therefore a need for effective methods that can audit (ϵ,δ)-differentially private algorithms before they are deployed in the real world. This article examines studies that provide empirical privacy guarantees for differentially private machine learning. It covers a wide range of topics on the subject and offers comprehensive guidance on privacy auditing schemes that use privacy attacks to protect machine-learning models from privacy leakage. Our results contribute to the growing literature on differential privacy in the realm of privacy auditing and beyond, and pave the way for future research on privacy-preserving models.
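The auditing idea summarized in the abstract can be illustrated with a minimal sketch. The example below is not from the article; it is a generic attack-based audit of the Laplace mechanism (a standard ϵ-differentially private primitive): a distinguishing attack is run many times on two neighboring datasets, and the log-ratio of the attack's true- and false-positive rates yields an empirical lower bound on the privacy loss, which can be compared against the claimed ϵ. The function names (`laplace_mechanism`, `audit_epsilon`) and the fixed threshold attack are illustrative assumptions, not the authors' method.

```python
import math
import random


def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    # Add Laplace noise with scale sensitivity/epsilon (inverse-CDF sampling),
    # which satisfies pure epsilon-differential privacy.
    u = rng.random() - 0.5
    scale = sensitivity / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise


def audit_epsilon(epsilon, trials=200_000, seed=0):
    """Monte Carlo, attack-based estimate of the privacy loss of the
    Laplace mechanism on two neighboring counting queries."""
    rng = random.Random(seed)
    # Neighboring datasets: counts that differ by one record (sensitivity 1).
    d0, d1 = 0.0, 1.0
    threshold = 0.5  # distinguishing attack: output > threshold => guess d1
    tpr = sum(laplace_mechanism(d1, 1.0, epsilon, rng) > threshold
              for _ in range(trials)) / trials
    fpr = sum(laplace_mechanism(d0, 1.0, epsilon, rng) > threshold
              for _ in range(trials)) / trials
    # Empirical epsilon: log-ratio of the attack's success rates.
    # It should never substantially exceed the theoretical epsilon.
    return math.log(tpr / fpr)
```

In this setting the empirical estimate stays below the theoretical ϵ (for ϵ = 1 it converges to roughly 0.83), which is the expected behavior for a correct implementation; an estimate exceeding the claimed ϵ would flag a bug, which is precisely the signal privacy auditing looks for.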

Keywords