IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)

An Adaptive Multiscale Gaussian Co-Occurrence Filtering Decomposition Method for Multispectral and SAR Image Fusion

  • Xunqiang Gong,
  • Zhaoyang Hou,
  • Ailong Ma,
  • Yanfei Zhong,
  • Meng Zhang,
  • Kaiyun Lv

DOI
https://doi.org/10.1109/JSTARS.2023.3296505
Journal volume & issue
Vol. 16, pp. 8215–8229

Abstract

Spectral information and backscatter information are both important bases for land cover classification, and these two kinds of information are found in multispectral images and SAR images, respectively. Therefore, fusing the complementary information of multispectral and SAR images can effectively improve land cover classification accuracy. However, existing fusion methods for multispectral and SAR images generally suffer from problems such as insensitivity to edge information, severe interference from speckle noise, and unreasonable fusion-rule settings, which lead to unsatisfactory land cover classification results. To address these problems, a fusion method based on adaptive multiscale Gaussian co-occurrence filtering decomposition is proposed. Gaussian filtering and adaptive co-occurrence filtering are applied to the original image to smooth out speckle noise and interfering edges within textures while preserving edge information between textures. Through multiscale spatial decomposition, detail information, edge information, and base information are separated, and multilayer fusion of image features is performed. Finally, a fused image with low noise interference, clear boundaries, and uniform pixel convergence is generated. Experimental results show that the proposed method generally performs best on eight evaluation indices compared with ten other methods. The overall accuracy, average accuracy, and Kappa coefficient of land cover classification are increased by 7.674%, 6.776%, and 0.098, respectively, compared with those of the original multispectral image in Area 1, and by 6.904%, 7.649%, and 0.089, respectively, compared with those of the original multispectral image in Area 2.
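The adaptive co-occurrence filter is the paper's own contribution and is not reproduced here, but the general decompose–fuse–reconstruct idea the abstract describes can be sketched with a plain two-scale Gaussian decomposition: a low-pass base layer captures smooth content, the residual captures detail, and each layer is fused with its own rule. This is a minimal illustrative sketch, not the authors' method; `two_scale_fuse` and its fusion rules (averaged base, max-absolute detail) are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def two_scale_fuse(ms, sar, sigma=2.0):
    """Toy two-scale fusion of two co-registered single-band images.

    A Gaussian low-pass gives each image a base layer; the residual is
    its detail layer. Base layers are averaged (preserve smooth radiometry),
    detail layers are fused by max-absolute selection (keep the stronger
    edge/texture response from either sensor).
    """
    base_ms = gaussian_filter(ms, sigma)
    base_sar = gaussian_filter(sar, sigma)
    det_ms = ms - base_ms
    det_sar = sar - base_sar
    base = 0.5 * (base_ms + base_sar)                       # average rule
    det = np.where(np.abs(det_ms) >= np.abs(det_sar),       # max-abs rule
                   det_ms, det_sar)
    return base + det                                       # reconstruct

# Synthetic stand-ins for one multispectral band and one SAR image.
rng = np.random.default_rng(0)
ms = rng.random((64, 64))
sar = rng.random((64, 64))
fused = two_scale_fuse(ms, sar)
print(fused.shape)
```

The proposed method differs in two key ways: the edge-preserving co-occurrence filter replaces the plain Gaussian (so edges between textures survive smoothing), and the decomposition is repeated across multiple scales with a separate fusion rule per layer.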

Keywords