Array (Sep 2022)

CE-Fed: Communication efficient multi-party computation enabled federated learning

  • Renuga Kanagavelu,
  • Qingsong Wei,
  • Zengxiang Li,
  • Haibin Zhang,
  • Juniarto Samsudin,
  • Yechao Yang,
  • Rick Siow Mong Goh,
  • Shangguang Wang

Journal volume & issue
Vol. 15
p. 100207

Abstract

Federated learning (FL) allows a number of parties to collectively train models without revealing their private datasets. However, even though FL prevents the sharing of raw data, personal or confidential information can still be extracted from the shared models. Secure Multi-Party Computation (MPC) can be leveraged to aggregate the locally trained models in a privacy-preserving manner, but it incurs high communication cost and poor scalability in a decentralized environment. We design CE-Fed, a novel communication-efficient MPC-enabled federated learning mechanism. In particular, CE-Fed is a hierarchical mechanism that forms a model aggregation committee with a small number of members and aggregates the global model only among committee members, instead of all participants. We develop a prototype and demonstrate the effectiveness of our mechanism on different datasets. The proposed CE-Fed achieves high accuracy, communication efficiency and scalability without compromising privacy.
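The committee idea can be illustrated with a minimal sketch. The code below is not the paper's protocol; it is a toy example of additive secret sharing over a prime field, where each party splits its (scalar) model update into shares for a small committee, so that no committee member sees an individual update and only the aggregate is revealed. All function names and the field modulus are illustrative assumptions.

```python
import random

PRIME = 2**61 - 1  # illustrative field modulus for additive secret sharing

def share(value, n):
    """Split an integer into n additive shares modulo PRIME.
    Any n-1 shares reveal nothing about the value."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def committee_aggregate(local_updates, committee_size):
    """Each party secret-shares its update among the committee members.
    Each member sums the shares it receives; combining the partial sums
    reveals only the aggregate, never any individual update."""
    partial = [0] * committee_size
    for update in local_updates:
        for i, s in enumerate(share(update, committee_size)):
            partial[i] = (partial[i] + s) % PRIME
    return sum(partial) % PRIME

# Toy scalar "model updates" from 4 participants, aggregated by a
# 3-member committee instead of among all participants.
updates = [5, 7, 9, 4]
agg = committee_aggregate(updates, committee_size=3)
assert agg == sum(updates)
```

The communication saving follows because each participant sends `committee_size` shares rather than exchanging messages with every other participant, so per-party cost stays constant as the number of participants grows.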

Keywords