IEEE Open Journal of the Communications Society (Jan 2022)

Multiple Parallel Federated Learning via Over-the-Air Computation

  • Gaoxin Shi,
  • Shuaishuai Guo,
  • Jia Ye,
  • Nasir Saeed,
  • Shuping Dang

DOI
https://doi.org/10.1109/OJCOMS.2022.3194821
Journal volume & issue
Vol. 3
pp. 1252–1264

Abstract

This paper investigates multiple parallel federated learning (FL) in cellular networks, where a base station (BS) schedules several FL tasks in parallel and each task involves a group of devices. To reduce the communication overhead, over-the-air computation is introduced, exploiting the superposition property of the multiple-access channel (MAC) to accomplish the aggregation step. Since all devices use the same radio resource to transmit their local updates to the BS, a zero-forcing receiver combiner is employed to separate the received signals of the different tasks and mitigate the mutual interference across groups. We further analyze the impact of the receiver combiner and device selection on the convergence of the multiple parallel FL framework, and formulate an optimization problem that jointly designs the receiver combiner vectors and the device schedule to improve FL performance. The problem is decoupled into two sub-problems that are solved alternately: successive convex approximation (SCA) is adopted to derive the receiver combiner vectors, and the device scheduling sub-problem is then solved with a greedy algorithm. Simulation results demonstrate that the proposed framework effectively addresses the straggler issue in FL and achieves near-optimal performance on all tasks.
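
As a concrete illustration of the aggregation step described above, the short sketch below simulates one over-the-air aggregation round: a zero-forcing combiner is chosen for each task group in the null space of the other groups' device channels, and each device inverts its own effective channel so that the signals transmitted on the shared resource add up, at the combiner output, to that task's sum of local updates. This is only a minimal sketch under assumed i.i.d. Rayleigh channels and fixed dimensions; it omits the paper's SCA-based combiner optimization, greedy device selection, and transmit-power constraints, and all names and parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    N_ANT, N_GROUPS, N_DEV, DIM = 16, 3, 4, 5   # BS antennas, FL tasks, devices per task, update length

    # Hypothetical flat-fading uplink channels: h[k, m] is the channel of device m in task group k.
    h = (rng.standard_normal((N_GROUPS, N_DEV, N_ANT))
         + 1j * rng.standard_normal((N_GROUPS, N_DEV, N_ANT))) / np.sqrt(2)

    updates = rng.standard_normal((N_GROUPS, N_DEV, DIM))    # local model updates to be aggregated

    # 1) Zero-forcing combiner per group: pick a vector in the null space of every
    #    other group's device channels, so cross-group interference is cancelled.
    combiners = np.zeros((N_GROUPS, N_ANT), dtype=complex)
    for k in range(N_GROUPS):
        interferers = np.vstack([h[j] for j in range(N_GROUPS) if j != k])   # stacked interfering channels
        _, _, vh = np.linalg.svd(interferers)
        null_basis = vh[interferers.shape[0]:].conj().T          # columns span null(interferers)
        w = null_basis @ (null_basis.conj().T @ h[k].sum(axis=0))  # keep some gain toward the own group
        combiners[k] = w / np.linalg.norm(w)

    # 2) Transmit scaling (channel inversion): each device pre-equalises its effective
    #    channel so the over-the-air sum seen through the combiner is the plain sum of updates.
    y = 0.01 * (rng.standard_normal((N_ANT, DIM)) + 1j * rng.standard_normal((N_ANT, DIM)))  # noise
    for k in range(N_GROUPS):
        for m in range(N_DEV):
            b = 1.0 / (combiners[k] @ h[k, m])         # per-device precoding scalar
            y += np.outer(h[k, m], b * updates[k, m])  # superposition on the shared resource

    # 3) Per-task aggregation: apply each group's combiner to the single received signal.
    for k in range(N_GROUPS):
        est = np.real(combiners[k] @ y)
        true = updates[k].sum(axis=0)
        print(f"task {k}: max aggregation error = {np.max(np.abs(est - true)):.4f}")

Note that the null-space construction above only works when the BS has more antennas than there are interfering devices; it is used here purely to make the zero-forcing idea explicit, not as a substitute for the combiner design studied in the paper.
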

Keywords