IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)

Deep Adaptive Phase Learning: Enhancing Synthetic Aperture Sonar Imagery Through Learned Coherent Autofocus

  • Isaac D. Gerg,
  • Daniel A. Cook,
  • Vishal Monga

DOI
https://doi.org/10.1109/JSTARS.2024.3393139
Journal volume & issue
Vol. 17
pp. 9517–9532

Abstract


Well-focused synthetic aperture sonar (SAS) imagery is essential for accurate analysis and for supporting autonomous systems. Despite advances in motion estimation and image formation methods, there remains a need for robust autofocus algorithms, deployed both topside and in situ aboard unmanned underwater vehicles (UUVs), for real-time processing. This need stems from the fact that systematic focus errors are common in SAS and often result from misestimating the sound speed of the medium or from uncompensated vehicle motion. In this article, we use an SAS-specific convolutional neural network (CNN) to robustly and quickly autofocus SAS images. Our method, which we call deep adaptive phase learning (DAPL), explicitly exploits the relationship between the $k$-space domain and the complex-valued SAS image, performing autofocus in a manner distinctly different from existing optical image deblurring techniques that rely solely on magnitude-only imagery. We demonstrate that DAPL mitigates three types of systematic phase errors common to SAS platforms (and combinations thereof): quadratic phase error (QPE), sinusoidal error, and sawtooth (i.e., yaw) error. We show results for DAPL on a publicly available, real-world high-frequency SAS dataset and compare them against several existing techniques, including phase gradient autofocus (PGA). Our results show that DAPL is competitive with or outperforms state-of-the-art alternatives without requiring manual parameter tuning.
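The defocus mechanism the abstract describes can be illustrated with a small sketch: a 1-D phase screen applied in the along-track $k$-space of a complex image models each of the three error types (quadratic, sinusoidal, sawtooth). This is a hypothetical illustration of the forward degradation model only, not the paper's DAPL network; the function and variable names are assumptions.

```python
import numpy as np

def apply_phase_error(img: np.ndarray, phase: np.ndarray) -> np.ndarray:
    """Apply a 1-D along-track phase error to a complex image in k-space.

    img   : complex-valued image, rows = along-track (azimuth) samples.
    phase : per-row phase error in radians, len(phase) == img.shape[0].
    Illustrative sketch of the systematic-defocus model, not DAPL itself.
    """
    spec = np.fft.fft(img, axis=0)          # transform to k-space along track
    spec *= np.exp(1j * phase)[:, None]     # multiply by the phase screen
    return np.fft.ifft(spec, axis=0)        # return to the image domain

n = 256
u = np.linspace(-1.0, 1.0, n)               # normalized along-track coordinate
qpe = 20.0 * u**2                           # quadratic phase error (QPE)
sin_err = 3.0 * np.sin(4 * np.pi * u)       # sinusoidal error
saw = np.angle(np.exp(1j * 8 * np.pi * u))  # wrapped ramp: sawtooth (yaw) error

# Synthetic complex "scene" standing in for a single-look complex SAS image.
rng = np.random.default_rng(0)
img = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

blurred = apply_phase_error(img, qpe)
# With perfect knowledge of the error, applying its negation restores focus;
# an autofocus algorithm must estimate that correction blindly.
restored = apply_phase_error(blurred, -qpe)
print(np.allclose(restored, img))           # → True
```

Because the degradation is a pure phase multiplication in $k$-space, magnitude-only deblurring discards exactly the information needed to invert it, which is why the abstract stresses operating on the complex-valued image.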

Keywords