IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2025)

Unsupervised Image Super-Resolution for High-Resolution Satellite Imagery via Omnidirectional Real-to-Synthetic Domain Translation

  • Minkyung Chung,
  • Yongil Kim

DOI
https://doi.org/10.1109/JSTARS.2025.3530959
Journal volume & issue
Vol. 18
pp. 4427–4445

Abstract


Image super-resolution (SR) aims to enhance the spatial resolution of images and overcome the hardware limitations of imaging systems. While deep-learning networks have significantly improved SR performance, obtaining paired low-resolution (LR) and high-resolution (HR) images for supervised learning remains challenging in real-world scenarios. In this article, we propose a novel unsupervised image super-resolution model for real-world remote sensing images, focusing specifically on HR satellite imagery. Our model, the bicubic-downsampled LR image-guided generative adversarial network for unsupervised learning (BLG-GAN-U), divides the SR process into two stages: LR image domain translation and image super-resolution. To realize this division, the model integrates omnidirectional real-to-synthetic domain translation with training strategies such as frequency separation and guided filtering. The model was evaluated through comparative analyses and ablation studies on real-world LR–HR datasets derived from WorldView-3 HR satellite imagery. The experimental results demonstrate that BLG-GAN-U effectively generates high-quality SR images with excellent perceptual quality and reasonable image fidelity, even with a relatively small network capacity.
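To make the two-stage idea in the abstract concrete, the following is a minimal, illustrative sketch in PyTorch of the inference flow it describes: a real-world LR image is first translated into the synthetic (bicubic-downsampled) LR domain, then super-resolved, with a simple blur-based frequency separation shown as it is commonly used in such training strategies. The class names (TranslationNet, SRNet), layer choices, and the 4x scale factor are hypothetical placeholders for illustration, not the authors' BLG-GAN-U architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TranslationNet(nn.Module):
    """Stage 1: real-to-synthetic LR domain translation (placeholder CNN)."""
    def __init__(self, ch=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 3, 3, padding=1),
        )

    def forward(self, x):
        # Residual correction that nudges the real LR input toward the synthetic LR domain.
        return x + self.body(x)


class SRNet(nn.Module):
    """Stage 2: super-resolution of the domain-translated LR image (placeholder CNN)."""
    def __init__(self, ch=64, scale=4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a scale-times-larger image
        )

    def forward(self, x):
        return self.body(x)


def frequency_separation(img, kernel_size=5):
    """Split an image into low- and high-frequency parts with a simple moving-average blur,
    as done in frequency-separation training (e.g., adversarial loss on high frequencies,
    content loss on low frequencies)."""
    low = F.avg_pool2d(img, kernel_size, stride=1, padding=kernel_size // 2)
    high = img - low
    return low, high


if __name__ == "__main__":
    real_lr = torch.rand(1, 3, 64, 64)        # stand-in for a real-world LR patch
    synthetic_lr = TranslationNet()(real_lr)  # stage 1: LR domain translation
    sr = SRNet()(synthetic_lr)                # stage 2: image super-resolution
    low, high = frequency_separation(sr)      # frequency split used during GAN training
    print(sr.shape, low.shape, high.shape)    # e.g., torch.Size([1, 3, 256, 256]) ...

In this sketch the two networks are kept deliberately small; the point is only the division of labor between domain translation and super-resolution, which the paper reports allows good perceptual quality with comparatively little network capacity.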

Keywords