IEEE Access (Jan 2020)

EdgeFed: Optimized Federated Learning Based on Edge Computing

  • Yunfan Ye,
  • Shen Li,
  • Fang Liu,
  • Yonghao Tang,
  • Wanting Hu

DOI
https://doi.org/10.1109/ACCESS.2020.3038287
Journal volume & issue
Vol. 8
pp. 209191 – 209198

Abstract

Federated learning (FL), an emerging framework for training a deep learning model from decentralized data, has received considerable attention with the development of mobile internet technology. Modern mobile devices often have access to rich but privacy-sensitive data, while their computational abilities are limited by hardware restrictions. In previous works based on the federated averaging (FedAvg) algorithm, mobile devices must perform a large amount of computation, and the process of global communication is time-consuming. Inspired by edge computing, we propose edge federated learning (EdgeFed), which splits the local model update that would otherwise be completed independently by each mobile device. The outputs of the mobile devices are aggregated at the edge server, improving learning efficiency and decreasing the frequency of global communication. Empirical experiments demonstrate that EdgeFed is advantageous under different bandwidth scenarios. In particular, by offloading part of the computation from mobile clients to the edge server, EdgeFed simultaneously reduces both the computational cost of the mobile devices and the global communication expense compared with FedAvg.
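
The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the split described in the abstract: each mobile device runs the lower layers of the model locally and sends its intermediate outputs to an edge server, which runs the upper layers and aggregates across devices. The split point, MNIST-sized input, three clients, and the single simulated round are all illustrative assumptions.

```python
# Illustrative sketch of the device/edge split described in the abstract
# (assumptions: MNIST-sized inputs, a one-hidden-layer split, 3 clients).
import torch
import torch.nn as nn

class DeviceModel(nn.Module):
    """Lower layers executed on the mobile device (assumed split point)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128),  # lightweight computation kept on-device
            nn.ReLU(),
        )

    def forward(self, x):
        return self.features(x)

class EdgeModel(nn.Module):
    """Upper layers executed on the edge server."""
    def __init__(self):
        super().__init__()
        self.classifier = nn.Linear(128, 10)

    def forward(self, h):
        return self.classifier(h)

# One simulated round with 3 hypothetical clients.
device_models = [DeviceModel() for _ in range(3)]
edge_model = EdgeModel()

local_batches = [torch.randn(8, 1, 28, 28) for _ in device_models]
labels = [torch.randint(0, 10, (8,)) for _ in device_models]

# Mobile-side computation: only the lower layers run on each device.
features = [m(x) for m, x in zip(device_models, local_batches)]

# Edge-side aggregation: intermediate outputs are combined and the
# remaining layers plus the loss are computed on the edge server.
h = torch.cat(features, dim=0)
y = torch.cat(labels, dim=0)
loss = nn.functional.cross_entropy(edge_model(h), y)
loss.backward()  # gradients flow back to every device's lower layers
```

In this sketch, global communication with a central server (omitted here) would only exchange the edge-level model, which is the mechanism the abstract credits for reducing global communication frequency.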

Keywords