IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2023)
An Effective Multimodel Fusion Method for SAR and Optical Remote Sensing Images
Abstract
Remote sensing images acquired by different sensors exhibit different characteristics due to their distinct imaging mechanisms. The fusion of Synthetic Aperture Radar (SAR) and optical remote sensing images is valuable for specific remote sensing applications, as it enables the extraction of texture features from SAR images while preserving the spectral information of optical images. Several fusion approaches have been proposed in recent years, including the Nonsubsampled Shearlet Transform Pulse Coupled Neural Network (NSST-PCNN), a typical and effective fusion method. However, it suffers from inconsistency in regional edge information. To address this issue, we propose a new method called MS-NSST-PCNN for multimodel fusion of SAR and optical remote sensing images. This method incorporates the multiscale morphological gradient (MSMG) into NSST-PCNN to detect edges and enhance the utilization of edge characteristics. The fusion results of two polarization modes, VV and VH, are evaluated alongside existing methods, using image fusion accuracy and visual interpretation criteria. The results demonstrate that for Sentinel-1 and Landsat-8 OLI image fusion, the proposed MS-NSST-PCNN method achieves higher correlation coefficients and lower spectral distortion with VV polarization than traditional methods in two study areas. Moreover, the proposed method also performs better for GF-3 and GF-2 images with higher spatial resolution. In subsequent land classification applications in urban and rural scenarios, the fusion results of the proposed method achieve higher accuracy than those of other fusion methods or the source images applied directly.
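The multiscale morphological gradient mentioned above combines morphological gradients (dilation minus erosion) computed with structuring elements of increasing size, weighting smaller scales more heavily so fine edges dominate. The abstract does not give the exact formulation used in the paper; the sketch below shows one common MSMG variant under those assumptions, with the `1/(2s+1)` weights being a conventional choice rather than a detail stated in the source.

```python
import numpy as np
from scipy import ndimage


def msmg(img, num_scales=3):
    """Multiscale morphological gradient (one common variant, not
    necessarily the paper's exact formulation).

    At each scale s, the gradient is the difference between a grey-level
    dilation and erosion with a (2s+1)x(2s+1) square structuring element;
    the per-scale gradients are combined with weights 1/(2s+1) so that
    smaller scales contribute more.
    """
    img = img.astype(np.float64)
    grad = np.zeros_like(img)
    for s in range(1, num_scales + 1):
        size = 2 * s + 1  # structuring element grows with scale
        g = (ndimage.grey_dilation(img, size=size)
             - ndimage.grey_erosion(img, size=size))
        grad += g / (2 * s + 1)  # down-weight coarser scales
    return grad
```

The resulting gradient map is large near intensity edges and zero in flat regions, which is how an edge-aware weight can be fed into the fusion rule.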
Keywords