Complex & Intelligent Systems (Aug 2023)

A composition–decomposition based federated learning

  • Chaoli Sun,
  • Xiaojun Wang,
  • Junwei Ma,
  • Gang Xie

DOI
https://doi.org/10.1007/s40747-023-01198-x
Journal volume & issue
Vol. 10, no. 1
pp. 1027 – 1042

Abstract

Federated learning has been shown to be efficient for training a global model without collecting all data from multiple entities on a centralized server. However, model performance, communication traffic, and data privacy and security remain central concerns in federated learning. In this paper, a composition–decomposition based federated learning method, denoted CD-FL, is proposed. In CD-FL, the global model, composed of K sub-models sharing the same architecture, is decomposed and broadcast to all clients. Each client randomly chooses one sub-model, updates its parameters on its own dataset, and uploads the updated sub-model to the server. All sub-models, both before and after updating, are then clustered into K clusters to form the global model for the next round. Experimental results on the Fashion-MNIST, CIFAR-10, EMNIST, and Tiny-IMAGENET datasets demonstrate the effectiveness of the proposed method in terms of both model performance and communication traffic.
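The following is a minimal Python sketch of one communication round as described in the abstract, not the authors' implementation. The names SubModel, local_train, and cluster_models are hypothetical placeholders, and the toy "clustering" step merely stands in for whatever clustering procedure CD-FL actually uses; it only illustrates the decompose–broadcast–update–cluster flow.

```python
import random
import copy

K = 4            # number of sub-models composing the global model (the paper's K)
NUM_CLIENTS = 10 # illustrative number of participating clients

class SubModel:
    """Stand-in for one sub-model; all K sub-models share the same architecture."""
    def __init__(self, params):
        self.params = list(params)

def local_train(sub_model, client_data):
    """Placeholder local update: nudge each parameter toward the client data mean."""
    mean = sum(client_data) / len(client_data)
    sub_model.params = [p + 0.1 * (mean - p) for p in sub_model.params]
    return sub_model

def cluster_models(models, k):
    """Toy clustering: order models by their first parameter, split them into k
    groups, and average each group into one representative sub-model. CD-FL
    similarly clusters all pre- and post-update sub-models into K clusters."""
    ordered = sorted(models, key=lambda m: m.params[0])
    groups = [ordered[i::k] for i in range(k)]
    reps = []
    for group in groups:
        n = len(group)
        avg = [sum(m.params[j] for m in group) / n for j in range(len(group[0].params))]
        reps.append(SubModel(avg))
    return reps

# --- one CD-FL-style communication round (illustrative data) ---
global_submodels = [SubModel([random.random() for _ in range(3)]) for _ in range(K)]
client_datasets = [[random.random() for _ in range(20)] for _ in range(NUM_CLIENTS)]

# Server decomposes the global model and broadcasts the K sub-models;
# each client randomly picks one, trains it locally, and uploads it.
uploaded = []
for data in client_datasets:
    chosen = copy.deepcopy(random.choice(global_submodels))
    uploaded.append(local_train(chosen, data))

# Server pools the sub-models from before and after updating, then clusters
# them back into K sub-models that compose the next round's global model.
pool = [copy.deepcopy(m) for m in global_submodels] + uploaded
global_submodels = cluster_models(pool, K)
```

Because each client uploads only one sub-model rather than the full global model, a round of this scheme exchanges roughly 1/K of the parameters per client, which is consistent with the communication-traffic benefit the abstract claims.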

Keywords