Remote Sensing (Sep 2024)

MDSCNN: Remote Sensing Image Spatial–Spectral Fusion Method via Multi-Scale Dual-Stream Convolutional Neural Network

  • Wenqing Wang,
  • Fei Jia,
  • Yifei Yang,
  • Kunpeng Mu,
  • Han Liu

DOI: https://doi.org/10.3390/rs16193583
Journal volume & issue: Vol. 16, No. 19, p. 3583

Abstract


Pansharpening refers to enhancing the spatial resolution of multispectral images using panchromatic images while preserving their spectral characteristics. However, existing traditional and deep learning methods often introduce distortions in the spatial or spectral dimensions. This paper proposes a remote sensing spatial–spectral fusion method based on a multi-scale dual-stream convolutional neural network, which comprises feature extraction, feature fusion, and image reconstruction modules at each scale. For feature fusion, we propose a multi-cascade module to better fuse image features. We also design a new loss function aimed at enforcing a high degree of consistency between the fused images and the reference images in terms of spatial detail and spectral information. To validate its effectiveness, we conduct thorough experimental analyses on two widely used remote sensing datasets, GeoEye-1 and Ikonos. Compared with nine leading pansharpening techniques, the proposed method demonstrates superior performance on multiple key evaluation metrics.
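The abstract does not give the network's exact configuration, so the following is only a minimal PyTorch sketch of the general idea it describes: one stream extracting features from the panchromatic (PAN) image, another from the upsampled multispectral (MS) image, a fusion/reconstruction stage, and a training loss combining a spatial term with a spectral-consistency term. All names (DualStreamFusionNet, spectral_angle_loss, fusion_loss), channel counts, and the loss weighting are illustrative assumptions, not details from the paper.

```python
# Minimal illustrative sketch (not the paper's exact MDSCNN): a single-scale
# dual-stream fusion block and a combined spatial + spectral training loss.
# Layer sizes, channel counts, and the loss weighting are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualStreamFusionNet(nn.Module):
    """One scale of a dual-stream pansharpening network (illustrative)."""

    def __init__(self, ms_bands: int = 4, feats: int = 32):
        super().__init__()
        # Spatial stream: extracts detail features from the 1-band PAN image.
        self.pan_stream = nn.Sequential(
            nn.Conv2d(1, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Spectral stream: extracts features from the upsampled MS image.
        self.ms_stream = nn.Sequential(
            nn.Conv2d(ms_bands, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Fusion + reconstruction: concatenated features -> fused MS image.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * feats, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, ms_bands, 3, padding=1),
        )

    def forward(self, pan: torch.Tensor, ms_lr: torch.Tensor) -> torch.Tensor:
        # Upsample the low-resolution MS image to the PAN resolution.
        ms_up = F.interpolate(ms_lr, size=pan.shape[-2:], mode="bicubic",
                              align_corners=False)
        feats = torch.cat([self.pan_stream(pan), self.ms_stream(ms_up)], dim=1)
        # Residual connection: predict the detail to add to the upsampled MS.
        return ms_up + self.fuse(feats)


def spectral_angle_loss(pred: torch.Tensor, ref: torch.Tensor) -> torch.Tensor:
    """Mean spectral angle (SAM-style) between per-pixel spectra."""
    cos = F.cosine_similarity(pred.flatten(2), ref.flatten(2), dim=1)
    return torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7)).mean()


def fusion_loss(pred, ref, spectral_weight: float = 0.1):
    # Spatial fidelity (L1) plus a spectral-consistency term; the 0.1 weight
    # is an arbitrary placeholder, not a value from the paper.
    return F.l1_loss(pred, ref) + spectral_weight * spectral_angle_loss(pred, ref)


if __name__ == "__main__":
    net = DualStreamFusionNet(ms_bands=4)
    pan = torch.randn(2, 1, 64, 64)      # high-resolution panchromatic patch
    ms_lr = torch.randn(2, 4, 16, 16)    # low-resolution multispectral patch
    fused = net(pan, ms_lr)
    loss = fusion_loss(fused, torch.randn_like(fused))
    print(fused.shape, float(loss))
```

In the paper this structure is repeated across multiple scales and the fusion step uses the proposed multi-cascade module; the single-scale concatenation above stands in only to show where those components would sit.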

Keywords