Mathematics (Aug 2024)

Symmetric ADMM-Based Federated Learning with a Relaxed Step

  • Jinglei Lu,
  • Ya Zhu,
  • Yazheng Dang

DOI
https://doi.org/10.3390/math12172661
Journal volume & issue
Vol. 12, no. 17
p. 2661

Abstract

Federated learning facilitates the training of global models in a distributed manner without requiring the sharing of raw data. This paper introduces two novel symmetric Alternating Direction Method of Multipliers (ADMM) algorithms for federated learning. Both algorithms use a convex combination of the current local and global variables to generate a relaxed step that improves computational efficiency, and both integrate two dual-update steps with different relaxation factors into the ADMM framework to improve accuracy and the convergence rate. They also rely only on weak parametric assumptions, which enhances computational feasibility. In addition, the second algorithm performs the global update only at certain iterations (e.g., those that are a multiple of a pre-defined integer) to improve communication efficiency. Theoretical analysis establishes linear convergence under reasonable conditions, and experimental results confirm that the proposed algorithms converge faster and are more efficient than existing methods.
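To make the abstract's description concrete, the following is a minimal, hypothetical sketch of one symmetric ADMM round with a relaxed step on a consensus-form federated problem with quadratic local losses. It is not the paper's algorithm: the mixing weight alpha, the two relaxation factors r and s, the penalty rho, and the aggregation period tau are illustrative placeholders, and the quadratic losses are chosen only so the local solve has a closed form.

```python
import numpy as np

# Sketch of symmetric ADMM with a relaxed step for the consensus problem
#   min_x sum_i f_i(x),  with local copies x_i constrained to equal z,
# where f_i(x) = 0.5 * ||A_i x - b_i||^2 so the local update is a linear solve.
# All parameter values below are illustrative, not the paper's choices.

rng = np.random.default_rng(0)
n_clients, dim, rho = 5, 10, 1.0
alpha = 0.7      # convex-combination weight for the relaxed step (assumed)
r, s = 0.5, 1.0  # two dual-update relaxation factors (assumed)
tau = 3          # global update performed only every tau rounds (assumed)

A = [rng.standard_normal((20, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(20) for _ in range(n_clients)]

z = np.zeros(dim)                              # global variable
u = [np.zeros(dim) for _ in range(n_clients)]  # scaled dual variables

for k in range(30):
    x_relaxed = []
    for i in range(n_clients):
        # Local solve: argmin_x 0.5*||A_i x - b_i||^2 + (rho/2)*||x - z + u_i||^2
        lhs = A[i].T @ A[i] + rho * np.eye(dim)
        rhs = A[i].T @ b[i] + rho * (z - u[i])
        x_i = np.linalg.solve(lhs, rhs)
        # Relaxed step: convex combination of local and global variables.
        x_hat = alpha * x_i + (1 - alpha) * z
        # First dual update (before the global step), relaxation factor r.
        u[i] = u[i] + r * (x_hat - z)
        x_relaxed.append(x_hat)
    # Communication-efficient variant: refresh z only every tau rounds.
    if (k + 1) % tau == 0:
        z = np.mean([x_relaxed[i] + u[i] for i in range(n_clients)], axis=0)
    # Second dual update (after the global step), relaxation factor s.
    for i in range(n_clients):
        u[i] = u[i] + s * (x_relaxed[i] - z)

print("consensus residual:", max(np.linalg.norm(xh - z) for xh in x_relaxed))
```

The two dual updates, one with the old global variable and one with the refreshed one, are what make the scheme "symmetric" in the Peaceman-Rachford sense; setting r = 0 and s = 1 recovers a standard relaxed consensus ADMM round.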

Keywords