Applied Sciences (Sep 2022)

Towards Federated Learning with Byzantine-Robust Client Weighting

  • Amit Portnoy,
  • Yoav Tirosh,
  • Danny Hendler

DOI: https://doi.org/10.3390/app12178847
Journal volume & issue: Vol. 12, no. 17, p. 8847

Abstract

Federated learning (FL) is a distributed machine learning paradigm in which data are distributed among clients who collaboratively train a model in a computation process coordinated by a central server. By assigning a weight to each client based on the proportion of data instances it possesses, the rate of convergence to an accurate joint model can be greatly accelerated. Some previous works studied FL in a Byzantine setting, in which a fraction of the clients may send arbitrary or even malicious information regarding their model. However, these works either ignore the issue of data imbalance altogether or assume that client weights are known a priori to the server. In practice, weights are likely to be reported to the server by the clients themselves and therefore cannot be relied upon. We address this issue for the first time by proposing a practical weight-truncation-based preprocessing method and demonstrating empirically that it strikes a good balance between model quality and Byzantine robustness. We also establish analytically that our method can be applied to a randomly selected sample of client weights.
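The core idea of truncation-based preprocessing can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: client-reported weights (e.g., sample counts) are clipped at an assumed cap `U` before normalization, so that a Byzantine client falsely reporting a huge count can claim only a bounded share of the aggregate. The function name and cap value are hypothetical.

```python
def truncate_weights(reported_counts, cap):
    """Clip each client's self-reported count at `cap`, then normalize.

    Bounds the influence any single client (honest or Byzantine)
    can gain by inflating its reported data size.
    """
    clipped = [min(float(c), cap) for c in reported_counts]
    total = sum(clipped)
    return [c / total for c in clipped]

# Three honest clients plus one client that falsely reports a huge count;
# after truncation at cap=500, the liar's weight is capped at 500/850.
weights = truncate_weights([100, 200, 50, 10_000_000], cap=500)
```

With this preprocessing, the malicious client's weight is 500/850 ≈ 0.59 rather than ≈ 1.0, at the cost of slightly distorting the weights of legitimately large clients.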

Keywords