The Open Journal of Astrophysics (May 2024)

Neural style transfer of weak lensing mass maps

  • Masato Shirasaki,
  • Shiro Ikeda

Journal volume & issue: Vol. 7

Abstract

We propose a new generative model of the projected cosmic mass density maps inferred from weak gravitational lensing observations of distant galaxies (weak lensing mass maps). We construct the model based on neural style transfer so that it can transform Gaussian weak lensing mass maps into the deeply non-Gaussian counterparts predicted by ray-tracing lensing simulations. We develop an unpaired image-to-image translation method with Cycle-Consistent Generative Adversarial Networks (CycleGAN), which learn an efficient mapping from an input domain to a target domain. Our model is designed with several important advantages: it is trainable without paired simulation data, it keeps the input domain visually meaningful, and it can rapidly produce maps with a larger sky coverage than the training data without additional learning. Using 10,000 lensing simulations, we find that appropriate labeling of the training data based on field variance allows the model to reproduce the correct scatter in summary statistics of weak lensing mass maps. Compared with a popular log-normal model, our model better predicts the statistics of three-point correlations and the local properties of rare high-density regions. We also demonstrate that our model can produce a continuous map with a sky coverage of $\sim166\, \mathrm{deg}^2$, but with non-Gaussian features similar to those of the training data covering $\sim12\, \mathrm{deg}^2$, in about one GPU minute. Hence, our model can benefit the mass production of synthetic weak lensing mass maps, which is of great importance for precision analyses of future real-world data.
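
To make the unpaired translation concrete, below is a minimal sketch of one CycleGAN generator-update step in PyTorch. The architectures, tensor shapes, loss weights, and random stand-in data are illustrative assumptions, not the authors' actual configuration, and the discriminator update is omitted for brevity.

    # Sketch of the unpaired CycleGAN translation described in the abstract:
    # G maps Gaussian mass maps to non-Gaussian ones, F maps back.
    import torch
    import torch.nn as nn

    def conv_net(in_ch, out_ch):
        # Toy fully convolutional generator; real networks would be deeper
        # (e.g. ResNet-style blocks) and trained on simulated mass maps.
        return nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1),
        )

    G = conv_net(1, 1)     # Gaussian domain -> non-Gaussian domain
    F = conv_net(1, 1)     # non-Gaussian domain -> Gaussian domain
    D_ng = nn.Sequential(  # discriminator on the non-Gaussian domain
        nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 1, 4, stride=2, padding=1),
    )

    opt = torch.optim.Adam(list(G.parameters()) + list(F.parameters()), lr=2e-4)
    l1, mse = nn.L1Loss(), nn.MSELoss()
    lambda_cyc = 10.0  # assumed cycle-consistency weight

    # Stand-ins for unpaired training batches of 64x64-pixel mass maps.
    x_gauss = torch.randn(8, 1, 64, 64)  # Gaussian realizations
    y_sim = torch.randn(8, 1, 64, 64)    # ray-tracing simulation maps

    fake_ng = G(x_gauss)
    d_out = D_ng(fake_ng)
    # Least-squares adversarial loss: G tries to make D_ng rate its output real.
    adv = mse(d_out, torch.ones_like(d_out))
    # Cycle consistency: F(G(x)) ~ x and G(F(y)) ~ y. This term is what
    # removes the need for paired simulation data during training.
    cyc = l1(F(fake_ng), x_gauss) + l1(G(F(y_sim)), y_sim)
    loss = adv + lambda_cyc * cyc
    opt.zero_grad(); loss.backward(); opt.step()

In this sketch, the variance-based labeling of training data mentioned in the abstract would amount to grouping the x_gauss and y_sim batches by field-variance bin before training.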