Frontiers in Neuroinformatics (Sep 2024)

Efficient federated learning for distributed neuroimaging data

  • Bishal Thapaliya,
  • Riyasat Ohib,
  • Eloy Geenjaar,
  • Jingyu Liu,
  • Vince Calhoun,
  • Sergey M. Plis

DOI: https://doi.org/10.3389/fninf.2024.1430987
Journal volume & issue: Vol. 18

Abstract

Recent advancements in neuroimaging have led to greater data sharing among the scientific community. However, institutions frequently maintain control over their data, citing concerns related to research culture, privacy, and accountability. This creates a demand for innovative tools capable of analyzing amalgamated datasets without the need to transfer actual data between entities. To address this challenge, we propose a decentralized sparse federated learning (FL) strategy. This approach emphasizes local training of sparse models to facilitate efficient communication within such frameworks. By capitalizing on model sparsity and selectively sharing parameters between client sites during the training phase, our method significantly lowers communication overheads. This advantage becomes increasingly pronounced when dealing with larger models and accommodating the diverse resource capabilities of various sites. We demonstrate the effectiveness of our approach through the application to the Adolescent Brain Cognitive Development (ABCD) dataset.
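The communication saving described above comes from transmitting only a sparse subset of each client's parameter update rather than the full dense model. The sketch below illustrates this general idea with top-k magnitude selection and index/value transmission; the function names, the selection rule, and the averaging scheme are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

def sparsify_topk(update, k):
    """Keep only the k largest-magnitude entries of a client's update.

    Returns the indices and values that would actually be transmitted;
    everything else stays on the client. (Illustrative choice of rule.)
    """
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def aggregate(sparse_updates, dim):
    """Average sparse client updates into one dense server-side update."""
    total = np.zeros(dim)
    counts = np.zeros(dim)
    for idx, vals in sparse_updates:
        total[idx] += vals
        counts[idx] += 1
    mask = counts > 0
    total[mask] /= counts[mask]  # average only where some client contributed
    return total

rng = np.random.default_rng(0)
dim, k = 1000, 50  # each client ships 50 of 1000 values: 95% less traffic
client_updates = [rng.normal(size=dim) for _ in range(4)]
sparse = [sparsify_topk(u, k) for u in client_updates]
server_update = aggregate(sparse, dim)
# At most 4*k coordinates of the aggregated update are nonzero.
print(np.count_nonzero(server_update))
```

Per round, each client here sends 2*k numbers (indices plus values) instead of `dim`, which is the kind of bandwidth reduction that grows more valuable as models get larger or as sites differ in resource capacity.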

Keywords