IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)

MS-GAN: Learn to Memorize Scene for Unpaired SAR-to-Optical Image Translation

  • Zhe Guo,
  • Zhibo Zhang,
  • Qinglin Cai,
  • Jiayi Liu,
  • Yangyu Fan,
  • Shaohui Mei

DOI
https://doi.org/10.1109/JSTARS.2024.3411691
Journal volume & issue
Vol. 17
pp. 11467–11484

Abstract

Synthetic aperture radar (SAR) and optical sensing are two important means of Earth observation. SAR-to-optical image translation (S2OIT) can integrate the advantages of both and assist SAR image interpretation under all-day, all-weather conditions. Existing S2OIT methods generally follow a paired training paradigm, which makes them difficult to apply in unpaired S2OIT scenarios. Moreover, the generator and discriminator in current S2OIT methods have insufficient scene memory for SAR images, resulting in regional landform deformation in the generated images. To address these issues, we propose MS-GAN, a novel generative adversarial network capable of memorizing scenes for unpaired S2OIT. A cycle learning framework based on the cycle generative adversarial network is designed to construct the translation mapping between unpaired SAR and optical images. A multiscale representation generator is constructed for the multiscale fusion and utilization of scene features in SAR images. The proposed multireceptive field discriminator enhances scene memory and generates higher-quality optical images across different landforms. In addition, the designed subbands shrinkage denoising module further suppresses the effect of speckle noise in SAR images on the quality of the generated results. Extensive experiments on three challenging datasets, SEN1-2, WHU-SEN-City, and QXS-SAROPT, demonstrate that the proposed MS-GAN outperforms state-of-the-art methods on both subjective and objective evaluation metrics.
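For orientation, below is a minimal PyTorch sketch of the CycleGAN-style unpaired training objective the abstract describes: two generators map SAR to optical and back, two discriminators provide adversarial feedback, and a cycle-consistency term replaces the need for paired images. The tiny stand-in networks, the loss weight lambda_cyc, and all tensor shapes are illustrative assumptions only; MS-GAN's actual multiscale representation generator, multireceptive field discriminator, and subbands shrinkage denoising module are not reproduced here.

```python
import torch
import torch.nn as nn

# Placeholder networks. MS-GAN's multiscale representation generator and
# multireceptive field discriminator are far more elaborate; these stand-ins
# only illustrate the cycle learning objective for unpaired S2OIT.
def make_generator(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, out_ch, 3, padding=1), nn.Tanh())

def make_discriminator(in_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(64, 1, 4, stride=2, padding=1))  # patch-level realism map

G_s2o = make_generator(1, 3)   # SAR (1-channel) -> optical (3-channel)
G_o2s = make_generator(3, 1)   # optical -> SAR
D_opt = make_discriminator(3)  # judges optical realism
D_sar = make_discriminator(1)  # judges SAR realism

adv = nn.MSELoss()   # least-squares adversarial loss
l1 = nn.L1Loss()
lambda_cyc = 10.0    # assumed cycle-consistency weight

def generator_loss(sar, opt):
    fake_opt = G_s2o(sar)
    fake_sar = G_o2s(opt)
    # Adversarial terms: each generator tries to fool its discriminator.
    pred_opt = D_opt(fake_opt)
    pred_sar = D_sar(fake_sar)
    loss_adv = (adv(pred_opt, torch.ones_like(pred_opt)) +
                adv(pred_sar, torch.ones_like(pred_sar)))
    # Cycle consistency: translating there and back should recover the
    # input, which is what lets training proceed without paired images.
    loss_cyc = l1(G_o2s(fake_opt), sar) + l1(G_s2o(fake_sar), opt)
    return loss_adv + lambda_cyc * loss_cyc

# Unpaired batches: the SAR and optical images need not correspond.
sar = torch.randn(2, 1, 64, 64)
opt = torch.randn(2, 3, 64, 64)
print(generator_loss(sar, opt).item())
```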

Keywords