IEEE Access (Jan 2024)

FedDBO: A Novel Federated Learning Approach for Communication Cost and Data Heterogeneity Using Dung Beetle Optimizer

  • Dongyan Wang,
  • Limin Chen,
  • Xiaotong Lu,
  • Yidi Wang,
  • Yue Shen,
  • Jingjing Xu

DOI
https://doi.org/10.1109/ACCESS.2024.3379273
Journal volume & issue
Vol. 12
pp. 43396–43409

Abstract

As an emerging distributed machine learning technology, federated learning has gained widespread attention for its privacy-preserving design. However, it also faces challenges such as high communication costs and heterogeneous client data. To address these issues, this paper proposes a federated learning approach based on the dung beetle optimizer, named FedDBO. In this method, the model parameters uploaded from clients to the server are transformed into model scores, and in each training round only the clients with high model scores are selected to upload their parameters to the server, thus reducing communication costs. Simultaneously, a model retraining strategy is introduced: after aggregating the model parameters sent by clients, the server performs a second stage of iterative training on the aggregated model using its own metadata, thereby mitigating data heterogeneity and improving model performance. In addition, a convergence proof is provided, showing that the model aggregated by FedDBO converges to the aggregated model of FedAvg after each training round. Finally, experiments simulating various data-heterogeneous environments show that FedDBO exhibits higher accuracy and better stability than three other algorithms: FedAvg, FedShare, and FedPSO.
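To make the round structure described above concrete, the following is a minimal sketch of one FedDBO-style training round on a toy linear-regression task. It is not the paper's implementation: the linear model, the negative-local-loss scoring rule, the top-k selection, and all function names (local_update, client_round, feddbo_round) are assumptions for illustration; the actual dung-beetle-optimizer scoring and the convergence machinery are defined in the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, x, y, epochs=5, lr=0.1):
    """Gradient-descent training of a linear model (stand-in for local training)."""
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w

def client_round(global_w, x, y):
    """Client-side step: train locally, then report a scalar model score.
    Scoring by negative local loss is an assumption; the abstract only says
    uploaded parameters are 'transformed into model scores', and the
    DBO-based scoring rule is specified in the paper."""
    w = local_update(global_w.copy(), x, y)
    score = -np.mean((x @ w - y) ** 2)
    return w, score

def feddbo_round(global_w, clients, server_x, server_y, k=3):
    # 1) Every client sends only its scalar score to the server.
    results = [client_round(global_w, x, y) for x, y in clients]
    scores = np.array([s for _, s in results])
    # 2) The server pulls full parameters only from the k best-scoring
    #    clients, which is where the communication saving comes from.
    top = np.argsort(scores)[-k:]
    agg = np.mean([results[i][0] for i in top], axis=0)
    # 3) Second-stage retraining on the server's own metadata, intended
    #    to offset client data heterogeneity.
    return local_update(agg, server_x, server_y)

# Toy usage: 10 clients with heterogeneous (feature-shifted) linear data.
d = 5
true_w = rng.standard_normal(d)

def make_dataset():
    x = rng.standard_normal((40, d)) + rng.standard_normal(d)  # per-client shift
    return x, x @ true_w + 0.1 * rng.standard_normal(40)

clients = [make_dataset() for _ in range(10)]
server_x, server_y = make_dataset()  # stands in for the server's metadata

w = np.zeros(d)
for _ in range(20):
    w = feddbo_round(w, clients, server_x, server_y)
print("distance to true weights:", np.linalg.norm(w - true_w))
```

In this sketch, the per-round upload cost scales with k rather than with the total number of clients, and the final server-side call to local_update plays the role of the retraining strategy from the abstract.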

Keywords