IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing (Jan 2024)
CERMF-Net: A SAR-Optical Feature Fusion for Cloud Elimination From Sentinel-2 Imagery Using Residual Multiscale Dilated Network
Abstract
Satellite-based Earth observation activities, such as urban and agricultural land monitoring, change detection, and disaster management, depend on adequate spatial and temporal coverage of ground observations. The presence of aerosols and clouds degrades the quality of optical ground observations and reduces the effective temporal resolution, which hampers the learning and extraction of valuable information. The uncertainty in the occurrence of clouds in the Earth's atmosphere and the possibility of land changes between subsequent temporal visits are the major challenges in cloud-free reconstruction. Advancements in deep learning have enabled learning from multisensor inputs, so cloud removal methods can draw on auxiliary information for better reconstruction. This research introduces a synthetic aperture radar (SAR)-guided feature Fusion approach for Cloud Elimination from Sentinel-2 multispectral imagery using a Residual Multiscale dilated Network (CERMF-Net). The proposed CERMF-Net fuses SAR with Sentinel-2 optical data and learns spatial–temporal dependencies and physical–geometrical properties for effective cloud removal. The generalizability and robustness of CERMF-Net are evaluated on the SEN12MS-CR dataset, a global real-world cloud-removal dataset. CERMF-Net shows superior performance compared with state-of-the-art techniques.
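To make the two ideas named in the abstract concrete, the following is a minimal PyTorch sketch (not the authors' implementation) of early SAR-optical fusion combined with residual blocks built from parallel dilated convolutions at multiple rates. The channel counts, dilation rates, band assignments, and fusion strategy are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class ResidualMultiscaleDilatedBlock(nn.Module):
    """Parallel dilated convolutions (multiscale context) with a residual skip."""

    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # Merge the concatenated branch outputs back to `channels`.
        self.merge = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, x):
        multiscale = torch.cat([branch(x) for branch in self.branches], dim=1)
        return x + self.merge(multiscale)  # residual connection


class SarOpticalFusionNet(nn.Module):
    """Fuse SAR (2 bands, e.g., Sentinel-1 VV/VH) with cloudy Sentinel-2
    (13 bands) and reconstruct a cloud-free 13-band image."""

    def __init__(self, sar_bands=2, optical_bands=13, features=64, num_blocks=4):
        super().__init__()
        self.stem = nn.Conv2d(sar_bands + optical_bands, features, 3, padding=1)
        self.body = nn.Sequential(
            *[ResidualMultiscaleDilatedBlock(features) for _ in range(num_blocks)]
        )
        self.head = nn.Conv2d(features, optical_bands, 3, padding=1)

    def forward(self, sar, cloudy_optical):
        x = self.stem(torch.cat([sar, cloudy_optical], dim=1))  # early fusion
        return self.head(self.body(x))


if __name__ == "__main__":
    sar = torch.randn(1, 2, 128, 128)      # SAR backscatter channels
    cloudy = torch.randn(1, 13, 128, 128)  # cloudy Sentinel-2 bands
    print(SarOpticalFusionNet()(sar, cloudy).shape)  # torch.Size([1, 13, 128, 128])
```

The dilated branches enlarge the receptive field without downsampling, while the residual skip preserves the uncorrupted optical content; the actual CERMF-Net architecture may differ in depth, fusion stage, and loss design.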
Keywords