Scientific Data (Dec 2024)

Generative models of MRI-derived neuroimaging features and associated dataset of 18,000 samples

  • Sai Spandana Chintapalli,
  • Rongguang Wang,
  • Zhijian Yang,
  • Vasiliki Tassopoulou,
  • Fanyang Yu,
  • Vishnu Bashyam,
  • Guray Erus,
  • Pratik Chaudhari,
  • Haochang Shou,
  • Christos Davatzikos

DOI
https://doi.org/10.1038/s41597-024-04157-4
Journal volume & issue
Vol. 11, no. 1
pp. 1–10

Abstract

Availability of large and diverse medical datasets is often challenged by privacy and data-sharing restrictions. Successful application of machine learning techniques for disease diagnosis, prognosis, and precision medicine requires large amounts of data for model building and optimization. To help overcome such limitations in the context of brain MRI, we present GenMIND: a collection of generative models of normative regional volumetric features derived from structural brain imaging. GenMIND models are trained on real regional volumetric measures from the iSTAGING consortium, which encompasses over 40,000 MRI scans across 13 studies and incorporates covariates such as age, sex, and race. Leveraging GenMIND, we produce and release 18,000 synthetic samples spanning the adult lifespan (ages 22–90 years), together with the models themselves, which can generate an unlimited number of additional samples. Experimental results indicate that samples generated by GenMIND align well with the distributions observed in real data. Most importantly, the generated normative data significantly enhance the accuracy of downstream machine learning models on tasks such as disease classification. The dataset and the generative models are publicly available.
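The downstream-augmentation claim can be illustrated with a minimal sketch: train a disease classifier on a small real cohort alone, then again after adding GenMIND synthetic normative samples to the training set, and compare held-out performance. Everything below is an assumption for illustration only (file names, column names, treating synthetic normative samples as additional controls, and the choice of classifier are hypothetical, not the authors' released code or protocol).

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical inputs: a labeled real cohort and GenMIND synthetic normative samples,
# both containing the same regional volumetric feature columns.
real = pd.read_csv("real_cohort.csv")             # features + "diagnosis" (0 = control, 1 = patient)
synthetic = pd.read_csv("genmind_synthetic.csv")  # normative features only
synthetic["diagnosis"] = 0                        # assumption: use synthetic normative samples as extra controls

feature_cols = [c for c in real.columns if c != "diagnosis"]

# Hold out part of the real cohort for evaluation; augmentation touches training data only.
train_real, test_real = train_test_split(
    real, test_size=0.3, stratify=real["diagnosis"], random_state=0
)

def fit_and_eval(train_df: pd.DataFrame) -> float:
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(train_df[feature_cols], train_df["diagnosis"])
    scores = clf.predict_proba(test_real[feature_cols])[:, 1]
    return roc_auc_score(test_real["diagnosis"], scores)

auc_real_only = fit_and_eval(train_real)
auc_augmented = fit_and_eval(pd.concat([train_real, synthetic], ignore_index=True))

print(f"AUC, real training data only:          {auc_real_only:.3f}")
print(f"AUC, real + synthetic normative data:  {auc_augmented:.3f}")
```

The key point of the sketch is that the test set remains entirely real, so any gain from the augmented run reflects the added synthetic normative data rather than evaluation leakage.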