Electronics Letters (Sep 2023)

Two‐stage personalized federated learning based on sparse pretraining

  • Tong Liu,
  • Kaixuan Xie,
  • Yi Kong,
  • Guojun Chen,
  • Yinfei Xu,
  • Lun Xin,
  • Fei Yu

DOI
https://doi.org/10.1049/ell2.12943
Journal volume & issue
Vol. 59, no. 17

Abstract

Personalized federated learning (PFL) was proposed to address the performance degradation of federated learning (FL) under heterogeneous data distributions; it produces a dedicated model for each client. However, existing PFL solutions focus only on the performance of the personalized models and ignore the performance of the global model, which affects the willingness of new clients to participate. To solve this problem, this paper proposes a new PFL solution, a two‐stage PFL based on sparse pretraining, which not only trains a sparse personalized model for each client but also obtains a sparse global model. The training process is divided into a sparse pretraining stage and a sparse personalized training stage, which focus on the performance of the global model and of the personalized models, respectively. In addition, a mask sparse aggregation technique is proposed to maintain the sparsity of the global model during the sparse personalized training stage. Experimental results show that, compared with existing algorithms, the proposed algorithm improves the accuracy of the global model while maintaining state‐of‐the‐art personalized model accuracy, and achieves higher communication efficiency.
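To make the mask sparse aggregation idea concrete, the following is a minimal Python/NumPy sketch of one plausible server-side step: each client uploads an update together with a binary mask of its active (non-pruned) weights, the server averages each weight only over the clients that kept it, and then re-prunes the result so the global model stays sparse. All function and variable names here are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

def mask_sparse_aggregate(global_w, deltas, masks, sparsity=0.8):
    """Aggregate masked client updates, then re-prune the global model.

    global_w : flat array of global weights
    deltas   : list of per-client updates (dense-shaped, zero outside mask)
    masks    : list of per-client binary masks over the same weights
    sparsity : fraction of weights to keep pruned (zero) in the global model
    """
    deltas = np.stack(deltas)   # shape (K, d)
    masks = np.stack(masks)     # shape (K, d), entries in {0, 1}
    counts = masks.sum(axis=0)  # how many clients updated each weight
    # Masked mean: average each weight only over clients whose mask kept it;
    # weights no client kept receive a zero update.
    agg = np.where(counts > 0,
                   (deltas * masks).sum(axis=0) / np.maximum(counts, 1),
                   0.0)
    new_w = global_w + agg
    # Re-sparsify: keep only the largest-magnitude (1 - sparsity) fraction.
    k = int(round((1.0 - sparsity) * new_w.size))
    thresh = np.sort(np.abs(new_w))[-k] if k > 0 else np.inf
    global_mask = (np.abs(new_w) >= thresh).astype(np.float64)
    return new_w * global_mask, global_mask

# Toy usage: three clients, ten weights.
rng = np.random.default_rng(0)
w = rng.normal(size=10)
deltas = [rng.normal(scale=0.1, size=10) for _ in range(3)]
masks = [(rng.random(10) > 0.5).astype(np.float64) for _ in range(3)]
w_new, m_new = mask_sparse_aggregate(w, deltas, masks)
print(int(m_new.sum()), "weights remain active in the global model")
```

The key design point this sketch illustrates is that aggregation never densifies the model: masked averaging restricts each update to weights some client actually trained, and the final magnitude-based re-pruning restores the target sparsity, which is also what keeps the upload and download payloads small.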

Keywords