IEEE Access (Jan 2023)

FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning

  • Yikai Xu,
  • Hongbo Fan

DOI: https://doi.org/10.1109/ACCESS.2023.3294812
Journal volume & issue: Vol. 11, pp. 72409–72417

Abstract

A significant challenge for most healthcare organizations today is predicting diseases from incomplete data, since patient records are often held in isolated silos. Federated learning (FL) addresses data silos by enabling remote local machines to collaboratively train a globally optimal model without sharing data. In this research, we present FedDK, a serverless framework that produces a personalized model for each federation, trained via FL on that federation's local data. Our approach uses convolutional neural networks (CNNs) to accumulate common knowledge and transfers it via knowledge distillation, which helps prevent common knowledge from being forgotten. Additionally, the missing common knowledge is filled in cyclically between federations, culminating in a personalized model for each group. This novel design leverages federated, deep, and integrated learning methods to produce more accurate machine-learning models. Our federated model exhibits superior performance to local training and baseline FL methods, achieving significant advantages.
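The sketch below illustrates the general idea of cyclic knowledge distillation the abstract describes: federations arranged in a ring, each training its local CNN against both its own labels and the softened predictions of its predecessor's model, so common knowledge circulates without any data leaving a federation. This is a minimal illustration under assumed details, not the authors' exact FedDK algorithm: the ring topology, the SmallCNN architecture, the temperature T, the mixing weight alpha, and all function names are hypothetical.

```python
# Minimal sketch of cyclic knowledge distillation between federations.
# Assumptions (not from the paper): ring topology, toy CNN, T=2.0, alpha=0.5.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

class SmallCNN(nn.Module):
    """Toy CNN standing in for each federation's local model."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)  # for 28x28 input

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hard-label cross-entropy plus a softened KL term that carries the
    'common knowledge' from the teacher (the predecessor federation)."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1 - alpha) * soft

def cyclic_round(models, loaders, lr=1e-3):
    """One cycle around the ring: federation i distills from federation i-1,
    so knowledge propagates while raw data stays local."""
    for i, (model, loader) in enumerate(zip(models, loaders)):
        teacher = models[i - 1]  # predecessor in the ring (wraps around at i=0)
        teacher.eval()
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            loss = distillation_loss(model(x), t_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()

if __name__ == "__main__":
    # Synthetic 28x28 single-channel data for three federations.
    loaders = [
        DataLoader(TensorDataset(torch.randn(64, 1, 28, 28),
                                 torch.randint(0, 2, (64,))), batch_size=16)
        for _ in range(3)
    ]
    models = [SmallCNN() for _ in range(3)]
    for _ in range(2):  # a couple of cycles around the ring
        cyclic_round(models, loaders)
    # Each federation keeps its own (personalized) model at the end.
```

In this sketch, personalization falls out naturally: each federation's model is only ever updated on its own data, while the distillation term injects what its neighbor has learned.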

Keywords