Optics (Mar 2025)
Single-Shot Wavefront Sensing in Focal Plane Imaging Using Transformer Networks
Abstract
Wavefront sensing is an essential technique in optical imaging, adaptive optics, and atmospheric turbulence correction. Traditional wavefront reconstruction methods, including the Gerchberg–Saxton (GS) algorithm and phase diversity (PD) techniques, are often limited by low inversion accuracy, slow convergence, and ambiguity among multiple possible solutions. Recent deep learning approaches offer alternatives, but conventional CNN-based models still struggle to capture global context effectively. To overcome these limitations, we propose a Transformer-based single-shot wavefront sensing method that reconstructs wavefront aberrations directly from focal plane intensity images. Our model integrates a Normalization-based Attention Module (NAM) into the CoAtNet architecture, strengthening feature extraction and yielding more accurate wavefront characterization. Experiments under both simulated and real-world conditions show that our method achieves a 4.5% reduction in normalized wavefront error (NWE) compared with ResNet34, indicating improved performance over conventional deep learning models. Additionally, by leveraging Walsh function modulation, our approach resolves the multiple-solution ambiguity inherent in phase retrieval. The proposed model combines high accuracy, fast convergence, and implementation simplicity, making it a promising solution for wavefront sensing applications.
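For context, the Gerchberg–Saxton baseline that the abstract contrasts against can be stated compactly. The sketch below is a minimal, generic GS phase retrieval loop (alternating amplitude constraints between the pupil and focal planes via FFTs); it is not the authors' method, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def gerchberg_saxton(pupil_amp, focal_amp, n_iter=200, seed=0):
    """Classic GS phase retrieval sketch (not the paper's model).

    Alternates between pupil and focal planes, enforcing the known
    amplitude in each plane while keeping the current phase estimate.
    """
    rng = np.random.default_rng(seed)
    # Start from a random phase guess inside the pupil.
    phase = rng.uniform(-np.pi, np.pi, pupil_amp.shape)
    field = pupil_amp * np.exp(1j * phase)
    for _ in range(n_iter):
        focal = np.fft.fft2(field)
        # Focal-plane constraint: replace amplitude, keep phase.
        focal = focal_amp * np.exp(1j * np.angle(focal))
        field = np.fft.ifft2(focal)
        # Pupil-plane constraint: replace amplitude, keep phase.
        field = pupil_amp * np.exp(1j * np.angle(field))
    return np.angle(field)
```

Because the focal-plane intensity is invariant under certain phase transformations, this iteration can stagnate or converge to one of several equivalent solutions, which is the multiple-solution problem the abstract addresses with Walsh function modulation.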
Keywords