Geo-spatial Information Science (Sep 2024)

Domain adaptation hyperspectral image fusion based on spatial-spectral domain separation

  • Yucheng Sun,
  • Yong Ma,
  • Yuan Yao,
  • Xiaoguang Mei,
  • Jun Huang,
  • Jiayi Ma

DOI
https://doi.org/10.1080/10095020.2024.2380476

Abstract

Hyperspectral datasets captured by airborne platforms are an important data source in the field of hyperspectral image fusion. However, the limited number of data samples often prevents deep learning-based methods from reaching their optimal performance on these datasets. Domain adaptation is expected to alleviate this data scarcity. Applying domain adaptation to hyperspectral image fusion, however, raises two issues: the heterogeneity of hyperspectral data and the diversity of its spatial degradation. In this paper, we propose a domain adaptation hyperspectral image fusion network based on spatial-spectral domain separation. The model, constructed with three-dimensional convolutional layers, effectively addresses the challenge of hyperspectral data heterogeneity: compared with 2D convolution, 3D convolution considers both the spatial and spectral dimensions and extracts spatial-spectral features efficiently. Based on this, we design a domain-separation architecture that extracts domain-invariant and domain-private features in the spatial-spectral domain. The architecture achieves domain adaptation by extracting domain-invariant features from both datasets, transferring prior knowledge learned from the source dataset to the target dataset. Additionally, the spatial degradation of different datasets varies with the acquisition conditions, so fusion algorithms should be robust to diverse degradation scenarios. To tackle this challenge, we design an Observation Module to predict degradation information and a Degradation Information Modulation Module to apply it to the input, thereby enhancing the network's robustness. Experiments on various datasets demonstrate that our method is qualitatively and quantitatively superior to state-of-the-art methods.
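
The abstract gives no implementation details, so the following PyTorch sketch is only a rough illustration of the two ideas it names: 3D convolutions that span the spectral and spatial dimensions together, and a domain-separation split into a shared (domain-invariant) encoder plus per-domain (domain-private) encoders. All class, parameter, and domain names here are hypothetical, not taken from the paper.

```python
# Minimal sketch (not the authors' code): 3D-convolutional spatial-spectral
# encoders with a domain-separation split into a shared (domain-invariant)
# branch and per-domain (domain-private) branches. Names are hypothetical.
import torch
import torch.nn as nn

class SpatialSpectralEncoder(nn.Module):
    """Stack of 3D convolutions over the (band, height, width) volume."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            # Input: (N, 1, bands, H, W); each kernel spans both the
            # spectral and the spatial dimensions.
            nn.Conv3d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class DomainSeparationNet(nn.Module):
    """Shared encoder captures domain-invariant spatial-spectral features;
    private encoders capture dataset-specific ones."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.shared = SpatialSpectralEncoder(channels)
        self.private = nn.ModuleDict({
            "source": SpatialSpectralEncoder(channels),
            "target": SpatialSpectralEncoder(channels),
        })

    def forward(self, x: torch.Tensor, domain: str):
        return self.shared(x), self.private[domain](x)

# Usage: a batch of 4 hyperspectral patches, 31 bands, 32x32 spatial size.
hsi = torch.randn(4, 1, 31, 32, 32)
model = DomainSeparationNet()
invariant_feats, private_feats = model(hsi, domain="source")
```

In practice such an architecture is typically trained with auxiliary losses, e.g. aligning the shared features across the two datasets while keeping the private features distinct from the shared ones; the abstract does not specify which objectives the authors use.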
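
The degradation-handling modules are likewise described only at a high level. One plausible reading, sketched below, is that an observation network summarizes the degraded input into a compact degradation descriptor, and the modulation step injects that descriptor through per-band scale and shift (FiLM-style conditioning). The module names follow the abstract, but their internals here are assumptions.

```python
# Minimal sketch (assumptions, not the paper's design): an observation
# network predicts a degradation descriptor from the degraded input, and a
# modulation step applies it via per-band scale/shift (FiLM-style).
import torch
import torch.nn as nn

class ObservationModule(nn.Module):
    """Predicts a compact degradation descriptor from the input image."""
    def __init__(self, bands: int, dim: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(bands, dim, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> one vector per image
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.features(x).flatten(1)  # shape (N, dim)

class DegradationInformationModulation(nn.Module):
    """Maps the descriptor to per-band scale/shift applied to the input."""
    def __init__(self, bands: int, dim: int = 32):
        super().__init__()
        self.to_scale_shift = nn.Linear(dim, 2 * bands)

    def forward(self, x: torch.Tensor, desc: torch.Tensor) -> torch.Tensor:
        scale, shift = self.to_scale_shift(desc).chunk(2, dim=1)
        # Broadcast (N, bands) over the spatial dimensions of (N, bands, H, W).
        return x * (1 + scale[:, :, None, None]) + shift[:, :, None, None]

# Usage: condition a 31-band low-resolution hyperspectral image on its
# predicted degradation descriptor before fusion.
lr_hsi = torch.randn(4, 31, 16, 16)
obs = ObservationModule(bands=31)
mod = DegradationInformationModulation(bands=31)
conditioned = mod(lr_hsi, obs(lr_hsi))
```

Conditioning on a predicted descriptor, rather than hard-coding one blur kernel or downsampling ratio, is what would let a single network stay robust across datasets whose spatial degradations differ, which is the motivation the abstract gives for these two modules.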

Keywords