Nature Communications (Apr 2022)
Communication-efficient federated learning via knowledge distillation
Abstract
This work presents a communication-efficient federated learning method that cuts a large fraction of the communication cost. It reveals the advantage of reciprocal learning for knowledge transfer between models and the evolving low-rank structure of deep model updates.
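The two ingredients named in the abstract can be made concrete. As a rough illustration of the reciprocal learning idea, a local (student) model and a mentor (teacher) model can each be trained against the other's softened predictions. The sketch below is not the paper's exact loss; it assumes plain bidirectional KL divergence with temperature-softened outputs, and all function names are hypothetical.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl(p, q, eps=1e-12):
    # Mean KL divergence D(p || q) over a batch of distributions.
    return (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=-1).mean()

def mutual_distillation_losses(student_logits, teacher_logits, temperature=2.0):
    # Soften both output distributions, then let each model imitate the other.
    ps = softmax(student_logits / temperature)
    pt = softmax(teacher_logits / temperature)
    return kl(pt, ps), kl(ps, pt)  # (student's KD loss, teacher's KD loss)
```

The low-rank observation suggests that a client can factorize each weight-update matrix with a truncated SVD and upload the factors instead of the full matrix. Again, this is a minimal sketch under that assumption, with illustrative names, not the paper's implementation.

```python
import numpy as np

def compress_update(delta, rank):
    # Truncated SVD: keep only the top-`rank` singular triplets.
    U, s, Vt = np.linalg.svd(delta, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def decompress_update(U, s, Vt):
    # Server-side reconstruction of the approximate update.
    return (U * s) @ Vt

rng = np.random.default_rng(0)
# Simulate an approximately rank-8 update of a 512 x 256 weight matrix.
delta = rng.normal(size=(512, 8)) @ rng.normal(size=(8, 256))
U, s, Vt = compress_update(delta, rank=8)

sent = U.size + s.size + Vt.size   # floats uploaded after compression
full = delta.size                  # floats in the raw update
err = np.linalg.norm(delta - decompress_update(U, s, Vt)) / np.linalg.norm(delta)
print(f"{full / sent:.0f}x fewer floats, relative error {err:.1e}")
```

A fixed rank is used here for brevity; exploiting the evolving low-rank property would mean adapting the retained rank as training progresses.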