Nature Communications (Nov 2023)

Deep learning at the edge enables real-time streaming ptychographic imaging

  • Anakha V. Babu,
  • Tao Zhou,
  • Saugat Kandel,
  • Tekin Bicer,
  • Zhengchun Liu,
  • William Judge,
  • Daniel J. Ching,
  • Yi Jiang,
  • Sinisa Veseli,
  • Steven Henke,
  • Ryan Chard,
  • Yudong Yao,
  • Ekaterina Sirazitdinova,
  • Geetika Gupta,
  • Martin V. Holt,
  • Ian T. Foster,
  • Antonino Miceli,
  • Mathew J. Cherukara

DOI
https://doi.org/10.1038/s41467-023-41496-z
Journal volume & issue
Vol. 14, no. 1
pp. 1–9

Abstract

Coherent imaging techniques provide an unparalleled multi-scale view of materials across scientific and technological fields, from structural materials to quantum devices, from integrated circuits to biological cells. Driven by the construction of brighter sources and high-rate detectors, coherent imaging methods like ptychography are poised to revolutionize nanoscale materials characterization. However, these advancements are accompanied by a significant increase in data and compute needs, which precludes real-time imaging, feedback, and decision-making capabilities with conventional approaches. Here, we demonstrate a workflow that leverages artificial intelligence at the edge and high-performance computing to enable real-time inversion on X-ray ptychography data streamed directly from a detector at up to 2 kHz. The proposed AI-enabled workflow eliminates the oversampling constraints, allowing low-dose imaging using orders of magnitude less data than required by traditional methods.
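
The sketch below illustrates, in broad strokes, the kind of streaming inference loop the abstract describes: diffraction frames are consumed one at a time as they arrive from the detector and passed through a trained neural network that predicts a real-space image, rather than accumulating a full scan for iterative phase retrieval. It is a minimal illustration only; the network architecture (PtychoInversionNet), the stream_inference helper, and the 64x64 frame size are placeholder assumptions and do not reproduce the authors' trained model or edge deployment stack.

import torch
import torch.nn as nn

class PtychoInversionNet(nn.Module):
    """Toy encoder-decoder mapping one diffraction pattern to a real-space
    phase patch (placeholder assumption, not the paper's trained model)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def stream_inference(model, frame_source, device="cuda" if torch.cuda.is_available() else "cpu"):
    """Consume diffraction frames as they arrive and yield one phase
    prediction per frame, without buffering the full scan."""
    model = model.to(device).eval()
    with torch.no_grad():
        for frame in frame_source:                          # frames pushed by the detector stream
            x = frame.unsqueeze(0).unsqueeze(0).to(device)  # shape (1, 1, H, W)
            yield model(x).squeeze().cpu()

if __name__ == "__main__":
    # Simulated stream of 64x64 diffraction patterns standing in for detector output.
    fake_frames = (torch.rand(64, 64) for _ in range(8))
    for i, phase in enumerate(stream_inference(PtychoInversionNet(), fake_frames)):
        print(f"frame {i}: predicted phase patch {tuple(phase.shape)}")

Because each frame is inverted independently by the network, per-frame latency rather than full-scan reconstruction time sets the feedback rate, which is what makes keeping pace with kilohertz detector streams plausible in this kind of workflow.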