PLoS Computational Biology (May 2024)

Versatile multiple object tracking in sparse 2D/3D videos via deformable image registration.

  • James Ryu,
  • Amin Nejatbakhsh,
  • Mahdi Torkashvand,
  • Sahana Gangadharan,
  • Maedeh Seyedolmohadesin,
  • Jinmahn Kim,
  • Liam Paninski,
  • Vivek Venkatachalam

DOI
https://doi.org/10.1371/journal.pcbi.1012075
Journal volume & issue
Vol. 20, no. 5
p. e1012075

Abstract

Tracking body parts in behaving animals, extracting fluorescence signals from cells embedded in deforming tissue, and analyzing cell migration patterns during development all require tracking objects with partially correlated motion. As dataset sizes increase, manual tracking becomes prohibitively slow, necessitating automated and semi-automated computational tools. Unfortunately, existing methods for multiple object tracking (MOT) are either developed for specific datasets, and hence do not generalize well to other datasets, or require large amounts of training data that are not readily available. These limitations are further exacerbated when tracking fluorescent sources in moving and deforming tissues, where the lack of unique features and sparsely populated images create a challenging environment, especially for modern deep learning techniques. By leveraging technology recently developed for spatial transformer networks, we propose ZephIR, an image registration framework for semi-supervised MOT in 2D and 3D videos. ZephIR can generalize to a wide range of biological systems by incorporating adjustable parameters that encode spatial (sparsity, texture, rigidity) and temporal priors of a given data class. We demonstrate the accuracy and versatility of our approach in a variety of applications, including tracking the body parts of a behaving mouse and neurons in the brain of a freely moving C. elegans. We provide an open-source package along with a web-based graphical user interface that allows users to provide a small number of annotations to interactively improve tracking results.
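The registration mechanism the abstract describes, differentiable image sampling that allows keypoint positions to be refined by gradient descent, can be illustrated with a short sketch. The code below is a minimal toy example in PyTorch for a single-channel 2D frame; the function names, toy data, and simple patch-intensity loss are illustrative assumptions and do not reproduce the authors' ZephIR implementation or its full spatial and temporal priors.

# Minimal sketch of spatial-transformer-style keypoint registration.
# Illustrative only; not the ZephIR implementation.
import torch
import torch.nn.functional as F

def sample_patches(frame, centers, half=3):
    """Bilinearly sample (2*half+1)^2 patches around keypoint centers.

    frame:   (1, C, H, W) image tensor
    centers: (K, 2) keypoint coordinates in normalized [-1, 1] space
    """
    K = centers.shape[0]
    _, _, H, W = frame.shape
    # Local grid offsets, converted from pixels to normalized coordinates
    ys = torch.linspace(-half, half, 2 * half + 1) * (2.0 / (H - 1))
    xs = torch.linspace(-half, half, 2 * half + 1) * (2.0 / (W - 1))
    dy, dx = torch.meshgrid(ys, xs, indexing="ij")
    offsets = torch.stack([dx, dy], dim=-1)              # (P, P, 2)
    grid = centers.view(K, 1, 1, 2) + offsets            # (K, P, P, 2)
    frames = frame.expand(K, -1, -1, -1)                 # (K, C, H, W)
    # grid_sample is differentiable w.r.t. the grid, so gradients
    # flow back to the keypoint positions themselves.
    return F.grid_sample(frames, grid, align_corners=True)

# Reference frame with annotated keypoints, and a new frame to track into.
ref = torch.rand(1, 1, 64, 64)
new = torch.roll(ref, shifts=(1, 2), dims=(2, 3))        # toy rigid motion
keypoints = torch.tensor([[0.0, 0.0], [0.3, -0.2]])      # annotated on ref
ref_patches = sample_patches(ref, keypoints).detach()

# Optimize per-keypoint displacements so patches sampled from the new
# frame match the annotated reference patches.
disp = torch.zeros_like(keypoints, requires_grad=True)
opt = torch.optim.Adam([disp], lr=0.05)
for _ in range(100):
    opt.zero_grad()
    loss = F.mse_loss(sample_patches(new, keypoints + disp), ref_patches)
    # A spring-like term coupling neighboring keypoints would encode the
    # rigidity prior mentioned in the abstract; omitted here for brevity.
    loss.backward()
    opt.step()

print("estimated keypoint positions:", (keypoints + disp).detach())

Because the loss is differentiable end to end, the same pattern extends to 3D volumes (grid_sample accepts 5D inputs) and to additional prior terms that penalize implausible deformations between frames.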