The Astronomical Journal (Jan 2024)

Using AI for Wave-front Estimation with the Rubin Observatory Active Optics System

  • John Franklin Crenshaw,
  • Andrew J. Connolly,
  • Joshua E. Meyers,
  • J. Bryce Kalmbach,
  • Guillem Megias Homar,
  • Tiago Ribeiro,
  • Krzysztof Suberlak,
  • Sandrine Thomas,
  • Te-Wei Tsai

DOI
https://doi.org/10.3847/1538-3881/ad1661
Journal volume & issue
Vol. 167, no. 2
p. 86

Abstract

The Vera C. Rubin Observatory will repeatedly survey the southern sky over a period of 10 yr. To ensure that images generated by Rubin meet the quality requirements for precision science, the observatory will use an active optics system (AOS) to correct for alignment and mirror-surface perturbations introduced by gravity and temperature gradients in the optical system. To accomplish this, Rubin will use out-of-focus images from sensors at the edge of the focal plane to learn and correct for perturbations to the wave front. We have designed and integrated a deep-learning (DL) model for wave-front estimation into the AOS pipeline. In this paper, we compare the performance of this DL approach to Rubin's baseline algorithm when applied to images from two different simulations of the Rubin optical system. We show that the DL approach is faster and more accurate, achieving the atmospheric error floor both for high-quality images and for low-quality images with heavy blending and vignetting. Compared to the baseline algorithm, the DL model is 40× faster, and its median error is 2× better under ideal conditions, 5× better in the presence of vignetting by the Rubin camera, and 14× better in the presence of blending in crowded fields. In addition, the DL model surpasses the required optical quality in simulations of the AOS closed loop. This system promises to increase the survey area useful for precision science by up to 8%. We discuss how this system might be deployed during the commissioning and operation of Rubin.
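The core idea described above, mapping pairs of out-of-focus "donut" images from the corner wave-front sensors to a set of wave-front coefficients with a neural network, can be sketched as follows. This is a minimal illustration rather than the published model: the class name DonutWavefrontNet, the layer sizes, the 160×160 stamp size, and the choice of 19 Zernike coefficients are assumptions made for the example.

```python
# Minimal sketch (not the authors' architecture): a small CNN that maps a pair
# of out-of-focus donut images to Zernike coefficients describing the wave front.
import torch
import torch.nn as nn


class DonutWavefrontNet(nn.Module):
    def __init__(self, n_zernikes: int = 19):
        super().__init__()
        # Two input channels: one intrafocal and one extrafocal donut stamp.
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        # Regress a vector of Zernike coefficients (count is an assumption here).
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, n_zernikes),
        )

    def forward(self, donuts: torch.Tensor) -> torch.Tensor:
        # donuts: (batch, 2, H, W) stacked intra-/extra-focal postage stamps
        return self.head(self.features(donuts))


if __name__ == "__main__":
    model = DonutWavefrontNet()
    fake_donuts = torch.randn(8, 2, 160, 160)  # batch of 8 donut pairs
    zernikes = model(fake_donuts)              # shape (8, 19)
    print(zernikes.shape)
```

In the AOS closed loop sketched by the abstract, coefficients like these would feed the controller that adjusts mirror figure and alignment; the example above only shows the estimation step, not the correction step.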

Keywords