IEEE Open Journal of the Communications Society (Jan 2024)

Coded Federated Learning for Communication-Efficient Edge Computing: A Survey

  • Yiqian Zhang,
  • Tianli Gao,
  • Congduan Li,
  • Chee Wei Tan

DOI
https://doi.org/10.1109/OJCOMS.2024.3423362
Journal volume & issue
Vol. 5
pp. 4098 – 4124

Abstract

In the era of artificial intelligence and big data, the demand for data processing has surged, driving ever-larger datasets and computational requirements. Distributed machine learning (DML) addresses this challenge by distributing tasks among multiple workers, reducing the resources required of each one. However, in distributed systems, slow machines, commonly known as stragglers, or failed links can prolong runtimes and degrade performance. This survey explores the application of coding techniques in DML and coded edge computing in distributed systems to enhance system speed, robustness, privacy, and more. Notably, the study delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, creating a tradeoff between computation and communication costs. The survey establishes that coding is a promising approach for building robust, secure, and low-latency distributed systems.
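The straggler-mitigation idea the abstract describes can be illustrated with a toy example: a minimal sketch of a (3, 2) MDS-style code for distributed matrix-vector multiplication, where one parity task lets the master recover the full result from any two of three workers. This is an illustrative assumption for exposition, not an algorithm taken from the survey; the function names (`encode_tasks`, `decode`) are hypothetical.

```python
import numpy as np

def encode_tasks(A):
    """Split A row-wise into two halves and add one parity block.

    Any 2 of the 3 returned blocks suffice to recover A @ x,
    so one straggling worker can be ignored.
    """
    A1, A2 = np.split(A, 2, axis=0)
    return [A1, A2, A1 + A2]

def decode(results):
    """Recover A @ x from any two worker results {worker_index: block @ x}."""
    if 0 in results and 1 in results:           # both systematic parts arrived
        return np.concatenate([results[0], results[1]])
    if 0 in results and 2 in results:           # A2 @ x = parity - A1 @ x
        return np.concatenate([results[0], results[2] - results[0]])
    return np.concatenate([results[2] - results[1], results[1]])

A = np.arange(8.0).reshape(4, 2)
x = np.array([1.0, 2.0])
blocks = encode_tasks(A)

# Suppose worker 1 is a straggler: decode from workers 0 and 2 only.
partial = {0: blocks[0] @ x, 2: blocks[2] @ x}
assert np.allclose(decode(partial), A @ x)
```

The extra parity computation is the "redundancy" the abstract refers to: each worker does one block multiplication, the total compute grows by 50%, and in exchange the runtime is set by the two fastest workers rather than the slowest.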

Keywords