Jisuanji kexue (Dec 2022)

Efficient Federated Learning Scheme Based on Background Optimization

  • GUO Gui-juan, TIAN Hui, WANG Tian, JIA Wei-jia

DOI
https://doi.org/10.11896/jsjkx.220600237
Journal volume & issue
Vol. 49, no. 12
pp. 40 – 45

Abstract


Federated learning can effectively ensure the privacy and security of data because each client trains on its data locally, and research on federated learning has made great progress. However, because the clients' data are non-independent and identically distributed, and unbalanced in both amount and type, training on local data inevitably suffers from problems such as poor accuracy and low training efficiency. To address the reduction in federated learning efficiency caused by these differences in the learning background, this paper proposes an efficient federated learning scheme based on background optimization that improves the accuracy of the local model on the terminal device, thereby reducing communication cost and improving the training efficiency of the whole model. Specifically, a first device and a second device are selected according to the difference in accuracy across environments, and the irrelevance between the first device's model and the global model (hereafter collectively referred to as the difference value) is taken as the standard difference value. Whether the second device uploads its local model is then determined by comparing its difference value with that of the first device. Experimental results show that, compared with traditional federated learning, the proposed scheme outperforms the federated averaging algorithm in common federated learning scenarios, improving accuracy by about 7.5% on the MNIST dataset and by about 10% on the CIFAR-10 dataset.
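The upload-decision rule described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the choice of 1 − cosine similarity as the "difference value" metric, the flattened-weight representation, and the comparison rule (upload only if the second device's difference value does not exceed the standard) are all assumptions for the sketch.

```python
import math

def difference_value(local_model, global_model):
    """Irrelevance between a local model and the global model,
    sketched here as 1 - cosine similarity of the flattened weights.
    (The abstract does not specify the paper's exact metric.)"""
    dot = sum(a * b for a, b in zip(local_model, global_model))
    norm = (math.sqrt(sum(a * a for a in local_model))
            * math.sqrt(sum(b * b for b in global_model)))
    return 1.0 - dot / norm

def should_upload(second_model, first_model, global_model):
    """The first device's difference value serves as the standard;
    the second device uploads its local model only if its own
    difference value does not exceed that standard (assumed rule)."""
    standard = difference_value(first_model, global_model)
    return difference_value(second_model, global_model) <= standard

# Toy usage with 2-dimensional "weight vectors":
global_w = [1.0, 0.0]
first_w = [0.9, 0.1]          # first device: close to the global model
print(should_upload([1.0, 0.05], first_w, global_w))  # near-global model
print(should_upload([0.0, 1.0], first_w, global_w))   # orthogonal model
```

Skipping uploads from devices whose models diverge too far from the global model is what saves communication rounds; only updates deemed sufficiently relevant reach the server for aggregation.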

Keywords