Scientific Reports (May 2024)

Global prototype distillation for heterogeneous federated learning

  • Shu Wu,
  • Jindou Chen,
  • Xueli Nie,
  • Yong Wang,
  • Xiancun Zhou,
  • Linlin Lu,
  • Wei Peng,
  • Yao Nie,
  • Waseef Menhaj

DOI
https://doi.org/10.1038/s41598-024-62908-0
Journal volume & issue
Vol. 14, no. 1
pp. 1–11

Abstract

Federated learning is a distributed machine learning paradigm whose goal is to collaboratively train a high-quality global model while private training data remain local on distributed clients. However, heterogeneous data distribution across clients poses a severe challenge for federated learning systems and substantially degrades model quality. To address this challenge, we propose global prototype distillation (FedGPD) for heterogeneous federated learning to improve the performance of the global model. The intuition is to use global class prototypes as knowledge to guide local training on the client side, so that local objectives become consistent with the global optimum and FedGPD learns an improved global model. Experiments show that FedGPD outperforms previous state-of-the-art methods by 0.22% to 1.28% in average accuracy on representative benchmark datasets.
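
The abstract only sketches the mechanism, so a minimal illustration may help. The PyTorch-style sketch below shows one plausible way to compute local class prototypes, aggregate them into global prototypes on the server, and add a prototype-distillation term to the client loss. The function names, the unweighted averaging, and the lambda_gpd weight are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def local_class_prototypes(features, labels, num_classes):
    # Mean feature vector per class observed on this client.
    protos = {}
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos

def aggregate_global_prototypes(client_protos, num_classes):
    # Server side: average each class prototype over the clients that reported it.
    # Unweighted averaging is an assumption; the paper may weight by sample counts.
    global_protos = {}
    for c in range(num_classes):
        vecs = [p[c] for p in client_protos if c in p]
        if vecs:
            global_protos[c] = torch.stack(vecs).mean(dim=0)
    return global_protos

def prototype_distillation_loss(features, labels, global_protos):
    # Pull each sample's feature toward the global prototype of its class.
    losses = []
    for f, y in zip(features, labels):
        c = int(y)
        if c in global_protos:
            losses.append(F.mse_loss(f, global_protos[c]))
    return torch.stack(losses).mean() if losses else features.new_zeros(())

During local training, each client would then minimise something like
loss = F.cross_entropy(logits, labels) + lambda_gpd * prototype_distillation_loss(features, labels, global_protos),
where lambda_gpd is a hypothetical weighting hyperparameter; this regulariser is what pulls heterogeneous local objectives toward the shared global prototypes.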

Keywords