Nature Communications (Sep 2023)

Deep learning-enabled realistic virtual histology with ultraviolet photoacoustic remote sensing microscopy

  • Matthew T. Martell,
  • Nathaniel J. M. Haven,
  • Brendyn D. Cikaluk,
  • Brendon S. Restall,
  • Ewan A. McAlister,
  • Rohan Mittal,
  • Benjamin A. Adam,
  • Nadia Giannakopoulos,
  • Lashan Peiris,
  • Sveta Silverman,
  • Jean Deschenes,
  • Xingyu Li,
  • Roger J. Zemp

DOI
https://doi.org/10.1038/s41467-023-41574-2
Journal volume & issue
Vol. 14, no. 1
pp. 1 – 17

Abstract


The goal of oncologic surgeries is complete tumor resection, yet positive margins are frequently found postoperatively using gold-standard H&E-stained histology methods. Frozen section analysis is sometimes performed for rapid intraoperative margin evaluation, albeit with known inaccuracies. Here, we introduce a label-free histological imaging method based on an ultraviolet photoacoustic remote sensing and scattering microscope, combined with unsupervised deep learning using a cycle-consistent generative adversarial network for realistic virtual staining. Unstained tissues are scanned at rates of up to 7 min/cm², at a resolution equivalent to 400× digital histopathology. Quantitative validation suggests strong concordance with conventional histology in benign and malignant prostate and breast tissues. In diagnostic utility studies we demonstrate a mean sensitivity and specificity of 0.96 and 0.91 in breast specimens, and 0.87 and 0.94, respectively, in prostate specimens. We also find that virtual stain quality is preferred (P = 0.03) over frozen section analysis in a blinded survey of pathologists.
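For readers unfamiliar with the cycle-consistent GAN framework named in the abstract, the sketch below illustrates the core idea behind unpaired (unsupervised) image translation: one generator maps label-free microscopy patches toward H&E-like appearance, a second maps back, and a cycle-consistency loss constrains the round trip. This is a minimal illustration under assumed conventions (PyTorch, toy single-channel generators, dummy data); it is not the authors' implementation, whose architectures, channel counts, discriminators, and loss weights differ.

# Minimal CycleGAN-style cycle-consistency sketch (illustrative only).
import torch
import torch.nn as nn

class SmallGenerator(nn.Module):
    """Toy fully convolutional generator; real systems use ResNet/U-Net backbones."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

G = SmallGenerator()  # label-free microscopy -> virtual H&E (hypothetical direction)
F = SmallGenerator()  # virtual H&E -> label-free microscopy

l1 = nn.L1Loss()
x = torch.randn(4, 1, 64, 64)  # unpaired label-free patches (dummy data)
y = torch.randn(4, 1, 64, 64)  # unpaired H&E patches (dummy data)

fake_y = G(x)
fake_x = F(y)
# Cycle-consistency term: translating to the other domain and back
# should recover the input; adversarial losses from per-domain
# discriminators (omitted here) are added to this in full training.
cycle_loss = l1(F(fake_y), x) + l1(G(fake_x), y)
print(float(cycle_loss))

Because the loss only requires unpaired sets of label-free and stained images, no pixel-aligned ground truth is needed, which is what makes this kind of virtual staining "unsupervised" in the sense used in the abstract.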