Frontiers in Nuclear Medicine (Nov 2024)

Contrastive learning for neural fingerprinting from limited neuroimaging data

  • Nikolas Kampel,
  • Farah Abdellatif,
  • N. Jon Shah,
  • Irene Neuner,
  • Jürgen Dammers

DOI
https://doi.org/10.3389/fnume.2024.1332747
Journal volume & issue
Vol. 4

Abstract


Introduction: Neural fingerprinting is a technique used to identify individuals based on their unique brain activity patterns. While deep learning techniques have been shown to outperform traditional correlation-based methods, they often require retraining to accommodate new subjects. Furthermore, the limited availability of samples in neuroscience research can impede the rapid adoption of deep learning methods, presenting a challenge for their broader application in neural fingerprinting.

Methods: This study addresses these challenges by using contrastive learning to eliminate the need for retraining with new subjects and by developing a data augmentation methodology to enhance model robustness under limited sample sizes. We used the LEMON dataset, comprising 3 Tesla MRI and resting-state fMRI scans from 138 subjects, and computed functional connectivity as a correlation-based baseline for fingerprinting performance. We adapted a recent deep learning model by incorporating data augmentation with short random temporal segments during training and reformulated the fingerprinting task as a contrastive problem, comparing the efficacy of a contrastive triplet loss against conventional cross-entropy loss.

Results: The results confirm that deep learning methods can substantially improve fingerprinting performance over correlation-based methods, achieving an accuracy of about 98% in identifying a single subject out of 138 subjects using 39 different functional connectivity profiles.

Discussion: The contrastive method showed added value in the "leave-subject-out" scenario, demonstrating flexibility comparable to correlation-based methods and robustness across different data sizes. These findings suggest that contrastive learning and data augmentation offer a scalable solution for neural fingerprinting, even with limited sample sizes.
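The approach described in the Methods can be illustrated with a minimal sketch. The Python/PyTorch code below is not the authors' implementation; the encoder architecture, segment length, margin, and toy data are illustrative assumptions. It only shows how functional connectivity profiles computed from short random temporal segments (the data augmentation) can be trained with a triplet loss so that embeddings of the same subject cluster together, which is what allows new subjects to be matched without retraining.

```python
# Minimal sketch (not the authors' implementation) of contrastive neural
# fingerprinting with triplet loss and random-temporal-segment augmentation.
# Assumption: each subject's rs-fMRI data is an (n_regions, n_timepoints) array.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def random_segment_connectivity(ts: np.ndarray, seg_len: int) -> torch.Tensor:
    """Augmentation: functional connectivity (Pearson correlation) computed
    from a short random temporal segment of the full scan."""
    n_regions, n_tp = ts.shape
    start = np.random.randint(0, n_tp - seg_len + 1)
    fc = np.corrcoef(ts[:, start:start + seg_len])        # (n_regions, n_regions)
    iu = np.triu_indices(n_regions, k=1)                   # keep upper triangle
    return torch.as_tensor(fc[iu], dtype=torch.float32)    # flattened FC profile


class FingerprintEncoder(nn.Module):
    """Maps a flattened connectivity profile to an L2-normalised embedding
    (architecture is an illustrative assumption)."""
    def __init__(self, in_dim: int, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, emb_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)


def train_step(encoder, optimizer, subjects, seg_len=100, margin=0.5):
    """One triplet-loss step: anchor and positive are two random segments of
    the same subject; the negative comes from a different subject."""
    i, j = np.random.choice(len(subjects), size=2, replace=False)
    anchor   = random_segment_connectivity(subjects[i], seg_len)
    positive = random_segment_connectivity(subjects[i], seg_len)
    negative = random_segment_connectivity(subjects[j], seg_len)

    emb = encoder(torch.stack([anchor, positive, negative]))
    loss = F.triplet_margin_loss(emb[0:1], emb[1:2], emb[2:3], margin=margin)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: 10 "subjects", 50 regions, 400 time points each.
    subjects = [rng.standard_normal((50, 400)) for _ in range(10)]
    in_dim = 50 * 49 // 2
    encoder = FingerprintEncoder(in_dim)
    optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    for step in range(5):
        print(f"step {step}: triplet loss = {train_step(encoder, optimizer, subjects):.4f}")
```

Because identification then reduces to comparing embeddings (e.g. nearest-neighbour matching of a query profile against stored reference profiles), a previously unseen subject only needs a reference embedding rather than a new output class, which is what removes the retraining step required by a cross-entropy classifier.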

Keywords