IEEE Access (Jan 2023)

Communication-Efficient Federated Learning for Power Load Forecasting in Electric IoTs

  • Zhengxiong Mao,
  • Hui Li,
  • Zuyuan Huang,
  • Chuanxu Yang,
  • Yanan Li,
  • Zihao Zhou

DOI
https://doi.org/10.1109/ACCESS.2023.3262171
Journal volume & issue
Vol. 11
pp. 47930 – 47939

Abstract

With the construction of the modern power system, power load forecasting is essential for keeping the electric Internet of Things in operation. However, it usually requires collecting massive amounts of power load data on a server, which risks leaking the privacy of the raw data. Federated learning can protect the privacy of clients' raw power load data, but it requires frequently transmitting model updates. To address the growing communication burden that this frequent communication with the server places on resource-heterogeneous clients, a communication-efficient federated learning algorithm based on Compressed Model Updates and Lazy uploAd (CMULA-FL) is proposed to reduce the communication cost. CMULA-FL also integrates an error compensation strategy to improve model utility. First, a compression operator compresses the transmitted model updates, and only updates with large norms are uploaded, which reduces both the per-epoch communication cost and the transmission frequency. Second, the errors introduced by compression and lazy upload are measured and accumulated into the next epoch to improve model utility. Finally, simulation experiments on benchmark power load data show that the communication cost decreases by at least 60% compared with the baseline, with a controlled loss in model prediction accuracy.
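The abstract describes three ingredients on the client side: a compression operator applied to model updates, a norm-based lazy-upload rule, and error compensation carried into the next epoch. The paper does not give the exact operator or threshold rule here, so the sketch below is a hypothetical illustration using top-k sparsification as the compression operator; `compress_top_k`, `client_step`, and `upload_threshold` are assumed names, not the authors' implementation.

```python
import numpy as np

def compress_top_k(update, k):
    """Keep only the k largest-magnitude entries (one common compression operator)."""
    out = np.zeros_like(update)
    if k > 0:
        idx = np.argpartition(np.abs(update), -k)[-k:]
        out[idx] = update[idx]
    return out

def client_step(update, residual, k, upload_threshold):
    """One client round: error compensation, compression, and lazy upload.

    Hypothetical sketch of the CMULA-FL idea: the client first adds the
    residual (error accumulated from earlier rounds), then compresses the
    corrected update. If the compressed update's norm is large enough it is
    uploaded and the compression error becomes the new residual; otherwise
    the upload is skipped and the whole corrected update is carried forward.
    """
    corrected = update + residual              # error compensation
    compressed = compress_top_k(corrected, k)  # reduce per-epoch payload
    if np.linalg.norm(compressed) >= upload_threshold:
        new_residual = corrected - compressed  # error of compression
        return compressed, new_residual
    # Lazy upload: small update, skip this round entirely
    return None, corrected
```

Under this sketch, the invariant `uploaded + residual == corrected` holds in every round, so no information is permanently discarded; it is only deferred, which is what lets the error compensation limit the loss of model utility.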

Keywords