ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (Dec 2023)

OPTICAL AND SAR IMAGE FUSION BASED ON VISUAL SALIENCY FEATURES

  • J. Zhang,
  • X. Ren,
  • J. Li,
  • L. Wang,
  • Y. Ye

DOI: https://doi.org/10.5194/isprs-annals-X-1-W1-2023-747-2023
Journal volume & issue: Vol. X-1-W1-2023, pp. 747–754

Abstract


As application scenarios for optical and SAR image fusion expand, it has become necessary to integrate information from both modalities for land classification, feature recognition, and target tracking. Current methods focus heavily on integrating multimodal feature information to enrich the fused images, while neglecting how severely modality differences and SAR speckle noise degrade the visual perception of the fused results. To address this problem, we propose a novel optical and SAR image fusion framework named Visual Saliency Features Fusion (VSFF). We improve the complementary-feature decomposition algorithm to suppress most of the speckle noise in the initial features and to divide each image into main structure features and detail texture features. To fuse the main structure features, we reconstruct a visual saliency feature map that contains the significant information from the optical and SAR images, and feed it together with the optical image into a total variation constrained model that computes the fusion result and achieves optimal information transfer. Meanwhile, we construct a new feature descriptor based on Gabor wavelets, which separates meaningful detail texture features from residual noise and selectively preserves the features that improve the interpretability of the fusion result. Finally, a fast IHS transform fusion supplements the fused image with realistic color information. In a comparative analysis with five state-of-the-art fusion algorithms, VSFF achieves better qualitative and quantitative results, and our fused images offer clear and appropriate visual perception.
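To make the pipeline described in the abstract more concrete, the following is a minimal sketch of a generic saliency-weighted two-scale fusion with fast IHS-style color injection. It is not the authors' VSFF implementation: the mean-filter decomposition, the local-contrast saliency proxy, the per-pixel weighting, and all function names (two_scale_decompose, saliency_weight, fuse_gray, inject_color_fast_ihs) are illustrative assumptions standing in for the paper's improved complementary-feature decomposition, visual saliency feature map, total variation constrained fusion, and Gabor-wavelet detail descriptor.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def two_scale_decompose(img, size=31):
    """Split an image into a main-structure (base) layer and a detail layer.

    A plain mean filter is used here as a stand-in for the paper's improved
    complementary-feature decomposition."""
    base = uniform_filter(img, size=size)
    detail = img - base
    return base, detail


def saliency_weight(img, size=11):
    """Crude visual-saliency proxy: absolute deviation from a local mean
    (local contrast), not the saliency map defined in the paper."""
    return np.abs(img - uniform_filter(img, size=size))


def fuse_gray(optical_gray, sar_gray):
    """Fuse co-registered single-band optical and SAR images by weighting
    base and detail layers with their saliency maps (sketch only; the paper
    instead solves a total variation constrained model)."""
    b_o, d_o = two_scale_decompose(optical_gray)
    b_s, d_s = two_scale_decompose(sar_gray)
    w_o = saliency_weight(optical_gray)
    w_s = saliency_weight(sar_gray)
    w = w_o / (w_o + w_s + 1e-8)              # per-pixel weight toward the optical image
    base = w * b_o + (1.0 - w) * b_s          # blend main-structure layers
    detail = np.where(w_o >= w_s, d_o, d_s)   # keep the more salient detail
    return base + detail


def inject_color_fast_ihs(optical_rgb, fused_gray):
    """Fast IHS-style color injection: add the difference between the fused
    intensity and the optical intensity to every band, so the fused result
    inherits the optical image's color."""
    intensity = optical_rgb.mean(axis=2)
    return np.clip(optical_rgb + (fused_gray - intensity)[..., None], 0.0, 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    optical = rng.random((128, 128, 3))   # placeholder optical RGB in [0, 1]
    sar = rng.random((128, 128))          # placeholder despeckled SAR band in [0, 1]
    fused = fuse_gray(optical.mean(axis=2), sar)
    fused_rgb = inject_color_fast_ihs(optical, fused)
    print(fused_rgb.shape, fused_rgb.min(), fused_rgb.max())
```

The sketch only illustrates the overall structure the abstract describes (decompose, fuse structure and detail with saliency guidance, then restore color); the paper's actual contributions lie in the improved decomposition, the TV-constrained structure fusion, and the Gabor-wavelet detail descriptor, none of which are reproduced here.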