Mathematics (Jun 2023)

A Communication-Efficient Federated Text Classification Method Based on Parameter Pruning

  • Zheng Huo,
  • Yilin Fan,
  • Yaxin Huang

DOI
https://doi.org/10.3390/math11132804
Journal volume & issue
Vol. 11, no. 13
p. 2804

Abstract


Text classification is an important application of machine learning. This paper proposes a communication-efficient federated text classification method based on parameter pruning. Because the data held by different participants in a federated learning architecture are not independent and identically distributed (non-IID), a federated word embedding model, FedW2V, is first proposed. The TextCNN model is then extended to the federated architecture. To reduce the communication cost of the federated TextCNN model, a parameter pruning algorithm called FedInitPrune is proposed, which reduces the amount of data transmitted in both the uplink and downlink during the parameter-transmission phase. The algorithms are evaluated on real-world datasets. The experimental results show that the communication cost of federated learning can be reduced by 74.26% at the price of less than a 2% loss in classification accuracy.
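The core idea of communication-efficient pruning can be illustrated with a minimal sketch: instead of uploading its full dense parameter vector each round, a client keeps only the largest-magnitude parameters and sends a sparse (index, value) payload, which the server re-expands. All names below are illustrative; this is a generic magnitude-pruning sketch, not a reproduction of the paper's FedInitPrune algorithm.

```python
def prune_by_magnitude(params, keep_ratio):
    """Keep only the largest-magnitude fraction of parameters.

    Returns (indices, values): the sparse payload a client would
    transmit instead of the full dense parameter vector.
    """
    n_keep = max(1, int(len(params) * keep_ratio))
    # Rank positions by absolute value, largest first.
    ranked = sorted(range(len(params)),
                    key=lambda i: abs(params[i]), reverse=True)
    kept = sorted(ranked[:n_keep])
    return kept, [params[i] for i in kept]


def expand(indices, values, size):
    """Receiver side: rebuild a dense vector, zeros at pruned positions."""
    dense = [0.0] * size
    for i, v in zip(indices, values):
        dense[i] = v
    return dense


# Example: keeping 50% of parameters halves the values transmitted
# (plus a small index overhead).
weights = [0.9, -0.02, 0.5, 0.01, -0.7, 0.03]
idx, vals = prune_by_magnitude(weights, keep_ratio=0.5)
restored = expand(idx, vals, len(weights))
```

The same payload reduction applies in both directions: clients upload pruned updates, and the server broadcasts the pruned global model, which is how savings accrue on the uplink and the downlink.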

Keywords