Nature Communications (Apr 2022)

Communication-efficient federated learning via knowledge distillation

  • Chuhan Wu,
  • Fangzhao Wu,
  • Lingjuan Lyu,
  • Yongfeng Huang,
  • Xing Xie

DOI
https://doi.org/10.1038/s41467-022-29763-x
Journal volume & issue
Vol. 13, no. 1
pp. 1–8

Abstract

This work presents a communication-efficient federated learning method that saves a large fraction of the communication cost. It reveals the advantage of reciprocal learning in machine knowledge transfer and the evolutionary low-rank properties of deep model updates.
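
The abstract names two ideas: reciprocal (mutual) knowledge distillation between a large and a small model, and low-rank structure in model updates that can be exploited to cut communication. Below is a minimal sketch of the first idea, in which two models learn from the task labels and from each other's softened predictions at the same time; the model sizes, the temperature T, and the loss weighting are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

# Hypothetical large "mentor" and small "mentee" models; sizes are illustrative.
mentor = torch.nn.Linear(32, 10)
mentee = torch.nn.Linear(32, 10)
opt = torch.optim.SGD(list(mentor.parameters()) + list(mentee.parameters()), lr=0.1)
T = 2.0  # softmax temperature for distillation (an assumed hyperparameter)

x = torch.randn(8, 32)          # a toy local batch
y = torch.randint(0, 10, (8,))  # toy labels

mentor_logits = mentor(x)
mentee_logits = mentee(x)

# Each model learns from the hard labels *and* from the other's soft
# predictions, so knowledge flows in both directions (reciprocal learning).
task_loss = F.cross_entropy(mentor_logits, y) + F.cross_entropy(mentee_logits, y)
kd_to_mentee = F.kl_div(F.log_softmax(mentee_logits / T, dim=1),
                        F.softmax(mentor_logits.detach() / T, dim=1),
                        reduction="batchmean") * T * T
kd_to_mentor = F.kl_div(F.log_softmax(mentor_logits / T, dim=1),
                        F.softmax(mentee_logits.detach() / T, dim=1),
                        reduction="batchmean") * T * T

loss = task_loss + kd_to_mentee + kd_to_mentor
opt.zero_grad()
loss.backward()
opt.step()
```

The second idea can be illustrated with SVD-based compression of an update matrix: if the update is approximately low-rank, transmitting its leading singular factors is much cheaper than transmitting the full matrix. The fixed energy threshold below is an assumed simplification; the paper describes the retained rank as evolving over training.

```python
import numpy as np

def compress_update(delta: np.ndarray, energy: float = 0.95):
    """Approximate a weight-update matrix by its top singular components.

    `energy` (the fraction of spectral energy to keep) is an illustrative
    knob, not a parameter from the paper.
    """
    u, s, vt = np.linalg.svd(delta, full_matrices=False)
    k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1
    return u[:, :k], s[:k], vt[:k, :]  # transmit these factors, not delta

rng = np.random.default_rng(0)
# A synthetic near-low-rank update, mimicking the low-rank structure the
# paper reports for deep model updates.
delta = rng.normal(size=(256, 8)) @ rng.normal(size=(8, 512))
u, s, vt = compress_update(delta)
sent = u.size + s.size + vt.size
print(f"rank kept: {s.size}, floats sent: {sent} vs {delta.size} "
      f"({sent / delta.size:.1%} of the original)")
```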