Jisuanji Kexue (Computer Science), Dec 2022

Federated Learning Optimization Method for Dynamic Weights in Edge Scenarios

  • CHENG Fan, WANG Rui-jin, ZHANG Feng-li

DOI
https://doi.org/10.11896/jsjkx.220700136
Journal volume & issue
Vol. 49, no. 12
pp. 53 – 58

Abstract


As a new computing paradigm, edge computing provides computing and storage services at the edge of the network rather than in the traditional cloud computing model, and offers high reliability and low latency. However, problems remain in privacy protection and data processing. Federated learning, as a distributed machine learning model, can effectively address inconsistent data distribution and data privacy in edge computing scenarios, but it still faces challenges in device heterogeneity, data heterogeneity, and communication, such as model drift, poor convergence, and loss of the computation results of some devices. To solve these problems, a federated learning optimization algorithm with dynamic weights (FedDw) is proposed. It focuses on the service quality of devices, reduces the heterogeneity caused by only part of the devices participating due to inconsistent training speeds, and determines each device's proportion in the final model aggregation according to its service quality, so that the aggregation result is more robust in complex real-world situations. In experiments, FedDw is compared with two strong federated learning algorithms, FedProx and Scaffold, on real data sets from 10 regional weather stations. The results show that FedDw achieves better overall performance.
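The abstract does not give the paper's exact weighting formula; the sketch below only illustrates the general idea of aggregating client models with weights proportional to a per-device service-quality score. The names (aggregate_dynamic_weights, quality_scores, client_updates) and the example scores are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def aggregate_dynamic_weights(client_updates, quality_scores):
    """Aggregate client model parameters with weights proportional to
    each client's service-quality score (illustrative sketch only).

    client_updates : list of dicts mapping parameter name -> np.ndarray
    quality_scores : list of non-negative floats, one per client
    """
    scores = np.asarray(quality_scores, dtype=float)
    weights = scores / scores.sum()  # normalize so the weights sum to 1

    aggregated = {}
    for name in client_updates[0]:
        # weighted average of each parameter tensor across clients
        aggregated[name] = sum(w * upd[name]
                               for w, upd in zip(weights, client_updates))
    return aggregated

# Example: three clients with hypothetical quality scores
clients = [{"w": np.array([1.0, 2.0])},
           {"w": np.array([2.0, 0.0])},
           {"w": np.array([0.0, 4.0])}]
print(aggregate_dynamic_weights(clients, quality_scores=[0.9, 0.5, 0.1]))
```

Under this sketch, a device with a higher service-quality score contributes proportionally more to the global model, which is the aggregation behavior the abstract describes.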

Keywords