IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)

Unsupervised Single-Generator CycleGAN-Based Pansharpening With Spatial-Spectral Degradation Modeling

  • Wenxiu Diao,
  • Mengying Jin,
  • Kai Zhang,
  • Liang Xiao

DOI
https://doi.org/10.1109/JSTARS.2023.3327169
Journal volume & issue
Vol. 16
pp. 10246 – 10263

Abstract


Supervised pansharpening methods require ground truth, which is generally unavailable; hence, unsupervised pansharpening methods have grown in popularity. Generative adversarial networks (GANs) are often employed for unsupervised pansharpening, but precisely controlling the generation process so that rich spatial and spectral details are captured is challenging. CycleGAN introduces a cycle consistency loss and cooperatively trains two generators and two discriminators to learn mappings between domains, which partially addresses the limited control over generated results in traditional GANs. CycleGAN can therefore also be employed for unsupervised pansharpening tasks. However, directly applying the CycleGAN network structure to pansharpening is complicated. To address this issue, we integrate a process model that simulates spatial and spectral degradations into a single-generator CycleGAN capable of learning the target distribution. Specifically, we propose an unsupervised CycleGAN for pansharpening based on spatial and spectral degradations, consisting of one lightweight generator and two discriminators. The low-resolution multispectral and panchromatic images are treated as the spatial and spectral degradations, respectively, of the high-resolution multispectral image. In addition, unsupervised loss functions comprising cycle consistency, adversarial, spectral angle mapper, and edge enhancement losses are designed to preserve spectral and spatial information. Experimental results on the QuickBird, GeoEye-1, and GF-2 datasets show that, in both qualitative and quantitative analyses, the proposed method is comparable with most supervised methods and superior to most unsupervised methods.
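Among the losses listed in the abstract, the spectral angle mapper (SAM) has a standard closed form: the angle between the spectral vectors of corresponding pixels. A minimal NumPy sketch of that measure follows; the function name and array layout are illustrative, not taken from the paper.

```python
import numpy as np

def sam_loss(pred, target, eps=1e-8):
    """Mean spectral angle (radians) between per-pixel spectral vectors.

    pred, target: arrays of shape (H, W, C), where C is the number
    of spectral bands. A value of 0 means identical spectral shape.
    """
    # Per-pixel dot product across the band dimension.
    dot = np.sum(pred * target, axis=-1)
    # Product of per-pixel spectral norms; eps guards against division by zero.
    norm = np.linalg.norm(pred, axis=-1) * np.linalg.norm(target, axis=-1)
    # Clip to the valid arccos domain to absorb floating-point drift.
    cos = np.clip(dot / (norm + eps), -1.0, 1.0)
    return float(np.mean(np.arccos(cos)))
```

Because the angle ignores vector magnitude, SAM penalizes spectral distortion (band-ratio changes) while remaining insensitive to uniform brightness scaling, which is why it complements an edge enhancement loss that targets spatial detail.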