eLife (Mar 2021)

3DeeCellTracker, a deep learning-based pipeline for segmenting and tracking cells in 3D time lapse images

  • Chentao Wen,
  • Takuya Miura,
  • Venkatakaushik Voleti,
  • Kazushi Yamaguchi,
  • Motosuke Tsutsumi,
  • Kei Yamamoto,
  • Kohei Otomo,
  • Yukako Fujie,
  • Takayuki Teramoto,
  • Takeshi Ishihara,
  • Kazuhiro Aoki,
  • Tomomi Nemoto,
  • Elizabeth MC Hillman,
  • Koutarou D Kimura

DOI
https://doi.org/10.7554/eLife.59187
Journal volume & issue
Vol. 10

Abstract

Despite recent improvements in microscope technologies, segmenting and tracking cells in three-dimensional time-lapse images (3D + T images) to extract their dynamic positions and activities remains a considerable bottleneck in the field. We developed a deep learning-based software pipeline, 3DeeCellTracker, by integrating multiple existing and new techniques, including deep learning for tracking. With only one volume of training data, one initial correction, and a few parameter changes, 3DeeCellTracker successfully segmented and tracked ~100 cells in the brains of both semi-immobilized and ‘straightened’ freely moving worms, in a naturally beating zebrafish heart, and ~1000 cells in a 3D cultured tumor spheroid. Although these datasets were imaged with highly divergent optical systems, our method tracked 90–100% of the cells in most cases, which is comparable to or better than previous results. These results suggest that 3DeeCellTracker could pave the way for revealing dynamic cell activities in image datasets that have been difficult to analyze.

Keywords