Image Analysis and Stereology (Mar 2024)
Improving the Learning Performance of Client’s Local Distribution in Cyclic Federated Learning
Abstract
Cyclic federated learning based on distribution information sharing and knowledge distillation (CFL_DS_KD) aims to address the challenge of non-IID data distributions while reducing communication requirements. However, when client data are extremely heterogeneous and scarce, clients struggle to fully learn their local data distributions with generative adversarial networks (GANs), which degrades the overall model performance. To overcome this limitation, we propose a transfer learning approach in which clients first pretrain their generators on a source domain and then fine-tune them on their local datasets. Our results on Alzheimer’s disease classification demonstrate that this method effectively improves clients’ distribution learning and enhances the overall model performance.
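The pretrain-then-fine-tune idea in the abstract can be illustrated with a deliberately minimal sketch. Everything below is hypothetical and not the paper's implementation: the client's GAN generator is reduced to a one-dimensional linear map g(z) = w·z + b, the adversarial loss is replaced by a simple moment-matching MSE, and the "source domain" and "scarce local dataset" are synthetic Gaussians. The point it demonstrates is only the transfer effect: starting local fine-tuning from source-pretrained weights gets closer to the local distribution in the same small number of steps than training from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, b, data, steps, lr=0.1):
    """Toy stand-in for GAN training: gradient descent on the MSE between
    generated samples g(z) = w*z + b and (randomly paired) data samples."""
    for _ in range(steps):
        z = rng.normal(size=data.shape)
        err = (w * z + b) - data
        w -= lr * np.mean(err * z)
        b -= lr * np.mean(err)
    return w, b

# Plentiful source-domain data (e.g. a public corpus; here synthetic).
source = rng.normal(loc=2.0, scale=1.0, size=5000)
# Scarce local client data with a shifted distribution.
local = rng.normal(loc=2.5, scale=1.0, size=50)

# Client pretrains its generator on the source domain...
w_pre, b_pre = train(0.0, 0.0, source, steps=500)
# ...then fine-tunes briefly on its small local dataset.
w_ft, b_ft = train(w_pre, b_pre, local, steps=10)

# Baseline: the same 10 local steps from random (zero) initialization.
w_s, b_s = train(0.0, 0.0, local, steps=10)

print("fine-tuned error:", abs(b_ft - local.mean()))
print("from-scratch error:", abs(b_s - local.mean()))
```

Under these toy assumptions the fine-tuned generator ends up markedly closer to the local mean than the from-scratch one, mirroring the abstract's claim that source-domain pretraining compensates for scarce local data.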
Keywords