npj Digital Medicine (Sep 2021)

Privacy-first health research with federated learning

  • Adam Sadilek,
  • Luyang Liu,
  • Dung Nguyen,
  • Methun Kamruzzaman,
  • Stylianos Serghiou,
  • Benjamin Rader,
  • Alex Ingerman,
  • Stefan Mellem,
  • Peter Kairouz,
  • Elaine O. Nsoesie,
  • Jamie MacFarlane,
  • Anil Vullikanti,
  • Madhav Marathe,
  • Paul Eastham,
  • John S. Brownstein,
  • Blaise Aguera y Arcas,
  • Michael D. Howell,
  • John Hernandez

DOI
https://doi.org/10.1038/s41746-021-00489-2
Journal volume & issue
Vol. 4, no. 1
pp. 1–8

Abstract

Privacy protection is paramount in conducting health research. However, studies often rely on data stored in a centralized repository, where analysis is done with full access to the sensitive underlying content. Recent advances in federated learning enable building complex machine-learned models that are trained in a distributed fashion. These techniques facilitate the calculation of research study endpoints such that private data never leaves a given device or healthcare system. We show—on a diverse set of single- and multi-site health studies—that federated models can achieve similar accuracy, precision, and generalizability, and lead to the same interpretation as standard centralized statistical models, while achieving considerably stronger privacy protections and without significantly raising computational costs. This work is the first to apply modern and general federated learning methods that explicitly incorporate differential privacy to clinical and epidemiological research—across a spectrum of units of federation, model architectures, complexity of learning tasks, and diseases. As a result, it enables health research participants to remain in control of their data and still contribute to advancing science—aspects that used to be at odds with each other.
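
The core technique referenced in the abstract, federated averaging combined with differential privacy, can be illustrated with a minimal sketch. The example below is not the authors' implementation: it trains a toy logistic-regression model across three simulated "sites" in plain NumPy, clipping each site's model update and adding Gaussian noise to the averaged update, in the spirit of DP-FedAvg. The helper names (client_update, federated_round), the learning task, and the clipping and noise parameters are all illustrative assumptions.

    import numpy as np

    def client_update(weights, X, y, lr=0.1, epochs=5):
        # Local logistic-regression training on one site's private data;
        # only the resulting model delta leaves the site, never the raw records.
        w = weights.copy()
        for _ in range(epochs):
            preds = 1.0 / (1.0 + np.exp(-X @ w))
            grad = X.T @ (preds - y) / len(y)
            w -= lr * grad
        return w - weights

    def federated_round(weights, clients, clip_norm=1.0, noise_mult=0.5, rng=None):
        # One round of federated averaging: clip each site's update to bound
        # its influence, average the updates, then add Gaussian noise to the
        # aggregate (the differential-privacy step in this toy sketch).
        rng = rng if rng is not None else np.random.default_rng(0)
        deltas = []
        for X, y in clients:
            delta = client_update(weights, X, y)
            norm = np.linalg.norm(delta)
            deltas.append(delta * min(1.0, clip_norm / (norm + 1e-12)))
        avg = np.mean(deltas, axis=0)
        noise = rng.normal(0.0, noise_mult * clip_norm / len(clients), size=avg.shape)
        return weights + avg + noise

    # Toy usage: three simulated sites, each holding its own synthetic data.
    rng = np.random.default_rng(42)
    clients = []
    for _ in range(3):
        X = rng.normal(size=(200, 4))
        y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(size=200) > 0).astype(float)
        clients.append((X, y))

    weights = np.zeros(4)
    for _ in range(20):
        weights = federated_round(weights, clients, rng=rng)
    print("learned weights:", weights)

In the studies reported in the paper, the unit of federation ranges from individual devices to whole healthcare systems, and privacy accounting is handled far more rigorously than this toy's fixed noise multiplier; the sketch only conveys the general structure of distributed training without raw data leaving each participant.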