Université Paris-Saclay, Inria, CEA, Palaiseau, France; Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Oleh Kozynets
Université Paris-Saclay, Inria, CEA, Palaiseau, France
David Sabbagh
Université Paris-Saclay, Inria, CEA, Palaiseau, France; Inserm, UMRS-942, Paris Diderot University, Paris, France; Department of Anaesthesiology and Critical Care, Lariboisière Hospital, Assistance Publique Hôpitaux de Paris, Paris, France
Guillaume Lemaître
Université Paris-Saclay, Inria, CEA, Palaiseau, France
Electrophysiological methods, that is, M/EEG, provide unique views into brain health. Yet, when building predictive models from brain data, it is often unclear how electrophysiology should be combined with other neuroimaging methods. Information can be redundant, useful common representations of multimodal data may not be obvious, and multimodal data collection can be medically contraindicated, which reduces applicability. Here, we propose a multimodal model to robustly combine MEG, MRI and fMRI for prediction. We focus on age prediction as a surrogate biomarker in 674 subjects from the Cam-CAN dataset. Strikingly, MEG, fMRI and MRI showed additive effects, supporting distinct brain-behavior associations. Moreover, the contribution of MEG was best explained by cortical power spectra between 8 and 30 Hz. Finally, we demonstrate that the model preserves the benefits of stacking when some data are missing. The proposed framework hence enables multimodal learning for a wide range of biomarkers from diverse types of brain signals.
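To illustrate the kind of stacking model the abstract refers to, the following is a minimal sketch, not the authors' exact pipeline: it assumes precomputed per-modality feature matrices (here random stand-ins), ridge regression for first-level age predictions per modality, a random-forest second-level learner, and a constant flag value to mark a missing modality; all of these choices are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.RandomState(0)
n = 674  # number of subjects in Cam-CAN
y = rng.uniform(18, 88, n)  # stand-in for subject age

# Stand-in feature matrices, one per modality (replace with real MEG/fMRI/MRI features).
modalities = {
    "meg": rng.randn(n, 300),
    "fmri": rng.randn(n, 200),
    "mri": rng.randn(n, 150),
}

# First level: cross-validated ridge predictions for each modality, so the
# second-level learner never sees in-sample fits.
first_level = []
for name, X in modalities.items():
    ridge = RidgeCV(alphas=np.logspace(-3, 5, 30))
    first_level.append(cross_val_predict(ridge, X, y, cv=10))
Z = np.column_stack(first_level)

# Simulate a missing modality for some subjects and mark it with a constant
# flag value (an illustrative convention), letting the tree-based stacker
# route around absent inputs.
missing = rng.rand(n) < 0.2
Z_missing = Z.copy()
Z_missing[missing, 0] = -1000.0  # e.g., MEG unavailable for these subjects

# Second level: random-forest stacker over the modality-wise predictions.
stacker = RandomForestRegressor(n_estimators=500, random_state=0)
stacked_pred = cross_val_predict(stacker, Z_missing, y, cv=10)
print("stacked MAE:", np.abs(stacked_pred - y).mean())
```

With real features, each modality's first-level prediction captures a different view of brain aging, and the stacker can exploit their additive contributions while remaining usable for subjects who lack one of the scans.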