International Journal of Digital Earth (Dec 2023)

Multiresolution generative adversarial networks with bidirectional adaptive-stage progressive guided fusion for remote sensing image

  • Yuanyuan Wu,
  • Yuchun Li,
  • Mengxing Huang,
  • Siling Feng

DOI
https://doi.org/10.1080/17538947.2023.2241441
Journal volume & issue
Vol. 16, no. 1
pp. 2962–2997

Abstract


A remote sensing image (RSI) with concurrently high spatial, temporal, and spectral resolutions cannot be produced by a single sensor. Multisource RSI fusion is a convenient technique for producing high-spatial-resolution multispectral (MS) images (spatial-spectral fusion, i.e. SSF) and high-temporal- and high-spatial-resolution MS images (spatiotemporal fusion, i.e. STF). Current deep learning-based fusion models implement only SSF or only STF; models that perform both are lacking. Multiresolution generative adversarial networks with bidirectional adaptive-stage progressive guided fusion (BAPGF) for RSI, named BPF-MGAN, are proposed to implement both SSF and STF. A bidirectional adaptive-stage feature-extraction architecture operating in both fine-scale-to-coarse-scale and coarse-scale-to-fine-scale modes is introduced. The designed BAPGF adopts a cross-stage-level dual-residual attention fusion strategy, guided by the previous fusion result, to enhance critical information and suppress superfluous information. Adaptive-resolution U-shaped discriminators feed multiresolution context back to the generator. A generalized multitask loss function that is not limited by the absence of reference images is developed to strengthen the model via constraints on multiscale feature, structural, and content similarities. The BPF-MGAN model is validated on SSF and STF datasets. Compared with state-of-the-art SSF and STF models, the results demonstrate the superior performance of the proposed BPF-MGAN model in both subjective and objective evaluations.
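The abstract describes a generalized multitask loss that constrains multiscale feature, structural, and content similarities. The following is a minimal NumPy sketch of such a weighted combination; the function names, the weights, the average-pooling multiscale term, and the simplified covariance-based structural term are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def content_loss(pred, target):
    # Content term: L1 (mean absolute error) on pixel intensities.
    return np.mean(np.abs(pred - target))

def structural_loss(pred, target, eps=1e-8):
    # Structural term: 1 minus a simplified global correlation,
    # used here as a stand-in for a full SSIM-style measure.
    cov = np.mean((pred - pred.mean()) * (target - target.mean()))
    return 1.0 - cov / (pred.std() * target.std() + eps)

def feature_loss(pred, target, scales=(1, 2, 4)):
    # Multiscale term: compare average-pooled versions of the images
    # at several scales (a crude proxy for deep feature similarity).
    total = 0.0
    for s in scales:
        h, w = pred.shape[0] // s * s, pred.shape[1] // s * s
        p = pred[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))
        t = target[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))
        total += np.mean(np.abs(p - t))
    return total / len(scales)

def multitask_loss(pred, target, w=(1.0, 1.0, 1.0)):
    # Weighted sum of the three constraint terms; the weights are
    # hypothetical hyperparameters.
    return (w[0] * feature_loss(pred, target)
            + w[1] * structural_loss(pred, target)
            + w[2] * content_loss(pred, target))
```

In a GAN setting such a term would typically be added to the adversarial loss when training the generator; identical inputs drive all three terms toward zero.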
