Electronic Research Archive (Apr 2023)

Hierarchical federated learning with global differential privacy

  • Youqun Long,
  • Jianhui Zhang,
  • Gaoli Wang,
  • Jie Fu

DOI
https://doi.org/10.3934/era.2023190
Journal volume & issue
Vol. 31, no. 7
pp. 3741 – 3758

Abstract


Federated learning (FL) is a distributed machine-learning framework that obtains an optimal model from clients' local updates. Thanks to its efficiency in model convergence and data communication, cloud-edge-client hierarchical federated learning (HFL) attracts more attention than the typical cloud-client architecture. However, HFL still threatens clients' sensitive data, since an adversary can analyze the uploaded and downloaded parameters. In this paper, to address such information leakage effectively, we propose a novel privacy-preserving scheme based on the concept of differential privacy (DP): Gaussian noise is added to the shared parameters when they are uploaded to edge and cloud servers and when they are broadcast to clients. Our algorithm achieves global differential privacy with adjustable noise throughout the architecture. We evaluate the performance on image classification tasks. In our experiment on the Modified National Institute of Standards and Technology (MNIST) dataset, we obtain 91% model accuracy with HFL-DP; our design is more secure while remaining accurate.
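The abstract describes clipping clients' shared parameters and perturbing them with Gaussian noise before they reach the edge or cloud server. A minimal sketch of that Gaussian mechanism is shown below; the function name, the clipping bound `clip_norm`, and the `noise_multiplier` hyperparameter are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_dp_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client's parameter update to an L2 bound, then add Gaussian
    noise scaled to that bound -- the standard Gaussian mechanism used to
    provide (epsilon, delta)-differential privacy for shared updates."""
    rng = np.random.default_rng() if rng is None else rng
    # Bound the update's sensitivity by rescaling it to L2 norm <= clip_norm.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    # Noise standard deviation is proportional to the sensitivity bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise
```

In an HFL setting this perturbation would be applied at each upload (client to edge, edge to cloud) and at each broadcast back to clients, with the noise scale adjustable per layer as the abstract indicates.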

Keywords