International Journal of Applied Earth Observation and Geoinformation (Dec 2023)

An adaptive multi-perceptual implicit sampling for hyperspectral and multispectral remote sensing image fusion

  • Chunyu Zhu,
  • Rongyuan Dai,
  • Liwei Gong,
  • Liangbo Gao,
  • Na Ta,
  • Qiong Wu

Journal volume & issue
Vol. 125
Art. no. 103560

Abstract


Hyperspectral and multispectral remote sensing image fusion (HMIF) can effectively enhance the spatial-spectral representation of an image. However, existing deep learning algorithms usually interpolate from neighboring pixels during sampling, ignoring correlations among pixels in different receptive fields. To address this problem, this study proposes an adaptive multi-perceptual field implicit guided sampling generative adversarial network (AMGSGAN) that strengthens long-range perception during sampling. The generator consists of a precoding module, a multi-perceptual field feature extraction module, and an adaptive guided implicit sampling module. The precoding module encodes image features to enhance their expressive capability; the multi-perceptual field feature extraction module extracts features at multiple receptive fields by setting different convolutional dilation rates; and the adaptive guided implicit sampling module interpolates each pixel to be sampled from pixels at different distances and adaptively fuses the interpolation results to generate the fused image. The discriminator adopts a pure CNN architecture and employs batch normalization and adaptive average pooling layers to ensure the stability and convergence speed of the network. Comparison experiments on several datasets show that the fusion performance of AMGSGAN is significantly better than that of the compared algorithms at 4x, 8x, and 16x upsampling scales, demonstrating the effectiveness of AMGSGAN. The code can be found at https://github.com/chunyuzhu/AMGSGAN.
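The abstract describes the multi-perceptual field extraction and adaptive fusion only at a high level. As a minimal PyTorch sketch of that idea, the hypothetical module below (the name MultiPerceptualField, the dilation rates 1/2/4, and the softmax weighting are illustrative assumptions, not taken from the AMGSGAN repository) runs parallel 3x3 convolutions with different dilation rates and fuses the branch outputs with learned softmax weights:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiPerceptualField(nn.Module):
        """Parallel dilated convolutions emulate different receptive fields;
        a learned softmax weighting fuses the branches adaptively."""
        def __init__(self, channels, dilations=(1, 2, 4)):
            super().__init__()
            # padding = dilation keeps the spatial size for a 3x3 kernel.
            self.branches = nn.ModuleList(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d)
                for d in dilations
            )
            # One scalar logit per branch; softmax turns them into fusion weights.
            self.logits = nn.Parameter(torch.zeros(len(dilations)))

        def forward(self, x):
            feats = torch.stack([b(x) for b in self.branches])   # (D, B, C, H, W)
            w = F.softmax(self.logits, dim=0).view(-1, 1, 1, 1, 1)
            return (w * feats).sum(dim=0)                        # (B, C, H, W)

    # Usage: multi-receptive-field features of a precoded 64-channel tensor.
    x = torch.randn(1, 64, 32, 32)
    y = MultiPerceptualField(64)(x)
    print(y.shape)  # torch.Size([1, 64, 32, 32])

The learned softmax weighting is one simple way to realize the "adaptive" fusion the abstract describes; the released code may condition the fusion differently.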
