IEEE Open Journal of the Communications Society (Jan 2024)

Federated Learning in Heterogeneous Wireless Networks With Adaptive Mixing Aggregation and Computation Reduction

  • Jingxin Li,
  • Xiaolan Liu,
  • Toktam Mahmoodi

DOI
https://doi.org/10.1109/OJCOMS.2024.3381545
Journal volume & issue
Vol. 5
pp. 2164 – 2182

Abstract

Despite the recent advancements achieved by federated learning (FL), its real-world deployment is significantly impeded by heterogeneous learning environments, specifically manifesting as devices with varying computing capabilities, non-IID (independent and identically distributed) data, and dynamic wireless transmission conditions. Such learning heterogeneity greatly harms learning performance, e.g., convergence and accuracy. Therefore, we introduce the AMA-FES (adaptive-mixing aggregation, feature-extractor sharing) framework with an asynchronous aggregation scheme to address these challenges. To mitigate the impact of non-IID data, we propose the AMA scheme, which maintains training stability by interpolating between the previous global model and the synchronised local model updates, avoiding abrupt switches to a completely new model. To reduce computation load, we introduce the FES scheme, which enables computation-limited devices to update only the classifier. To handle asynchronous model updates caused by transmission delay, we perform asynchronous aggregation with staleness-based weighting. We implement the AMA-FES framework in a practical scenario where mobile UAVs act as FL training clients conducting image classification tasks. The experimental results validate the effectiveness of the AMA-FES scheme in restoring training stability and learning accuracy without incurring extra computation or communication overhead in heterogeneous wireless networks.
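The two server-side ideas in the abstract, mixing the new aggregate with the previous global model and down-weighting stale updates, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the mixing coefficient `alpha`, the polynomial staleness decay `(1 + s)^(-beta)`, and the function names are all assumptions made for the example.

```python
import numpy as np

def staleness_weight(staleness, beta=0.5):
    # Hypothetical polynomial decay: the older (more stale) an update
    # is, the less it contributes to the aggregate.
    return (1.0 + staleness) ** (-beta)

def ama_aggregate(global_model, local_updates, stalenesses, alpha=0.5):
    """Adaptive-mixing aggregation sketch: blend the previous global
    model with a staleness-weighted average of local updates, rather
    than replacing the global model outright."""
    weights = np.array([staleness_weight(s) for s in stalenesses])
    weights /= weights.sum()  # normalise so the weights sum to 1
    local_avg = sum(w * u for w, u in zip(weights, local_updates))
    # alpha controls how much of the previous global model is retained.
    return alpha * global_model + (1.0 - alpha) * local_avg

# Toy example with 2-parameter "models":
g = np.array([1.0, 1.0])
updates = [np.array([2.0, 2.0]), np.array([4.0, 4.0])]
new_g = ama_aggregate(g, updates, stalenesses=[0, 0], alpha=0.5)
# Equal staleness -> equal weights, local_avg = [3, 3]; mixed -> [2, 2]
```

The FES idea would apply one layer down: a computation-limited client would freeze the shared feature extractor and send back only updated classifier parameters, so only that subset enters the aggregation above.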

Keywords