International Journal of Applied Earth Observation and Geoinformation (May 2024)

MCDNet: Multilevel cloud detection network for remote sensing images based on dual-perspective change-guided and multi-scale feature fusion

  • Junwu Dong,
  • Yanhui Wang,
  • Yang Yang,
  • Mengqin Yang,
  • Jun Chen

Journal volume & issue
Vol. 129
Art. no. 103820

Abstract

Cloud detection plays a crucial role in the preprocessing of optical remote sensing images. While many deep learning-based methods have shown strong performance in detecting thick clouds, their ability to identify thin and broken clouds is often inadequate because such clouds are sparsely distributed, semi-transparent, and similar to background regions. To address this limitation, we introduce a multilevel cloud detection network (MCDNet) capable of simultaneously detecting thick and thin clouds. This network improves the accuracy of identifying thin and broken clouds by integrating a dual-perspective change-guided mechanism (DPCG) and a multi-scale feature fusion module (MSFF). The DPCG creates a dual-input stream by combining the original image with its thin-cloud-removed counterpart, and then uses a dual-perspective feature fusion module (DPFF) to fuse features and extract change features, thereby improving the model's ability to perceive thin cloud regions and mitigating inter-class similarity in multilevel cloud detection. The MSFF enhances the model's sensitivity to broken clouds by using multiple non-adjacent low-level features to compensate for the spatial information lost in the high-level features during repeated downsampling. Experimental results on the L8-Biome and WHUS2-CD datasets demonstrate that MCDNet significantly improves the detection of both thin and broken clouds, and outperforms state-of-the-art methods in accuracy and efficiency. The code of MCDNet is available at https://github.com/djw-easy/MCDNet.
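The sketch below is an illustration only, not the authors' implementation (the released code is at the repository linked above). It shows one plausible reading of the two ideas named in the abstract: a dual-perspective fusion that combines features from the original image and its thin-cloud-removed counterpart, using their difference as a change cue, and a multi-scale fusion that projects and upsamples non-adjacent low-level feature maps to restore spatial detail in a high-level map. All module names, channel sizes, and the absolute-difference change cue are assumptions made for clarity.

    # Hedged sketch in PyTorch; layer choices are assumptions, not MCDNet's code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DualPerspectiveFusion(nn.Module):
        """Fuse features of the original image and its thin-cloud-removed
        counterpart, using their difference as a change-guided cue."""
        def __init__(self, channels: int):
            super().__init__()
            self.fuse = nn.Sequential(
                nn.Conv2d(channels * 3, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )

        def forward(self, feat_orig, feat_removed):
            change = torch.abs(feat_orig - feat_removed)  # change cue (assumed form)
            return self.fuse(torch.cat([feat_orig, feat_removed, change], dim=1))

    class MultiScaleFusion(nn.Module):
        """Merge a high-level feature map with non-adjacent low-level maps
        to recover spatial detail lost during downsampling."""
        def __init__(self, high_ch: int, low_chs: list, out_ch: int):
            super().__init__()
            self.proj = nn.ModuleList(
                nn.Conv2d(c, out_ch, kernel_size=1) for c in [high_ch, *low_chs]
            )
            self.fuse = nn.Conv2d(out_ch * (1 + len(low_chs)), out_ch,
                                  kernel_size=3, padding=1)

        def forward(self, high, lows):
            size = lows[0].shape[-2:]  # upsample everything to the finest resolution
            maps = [F.interpolate(self.proj[0](high), size=size,
                                  mode="bilinear", align_corners=False)]
            for p, low in zip(self.proj[1:], lows):
                maps.append(F.interpolate(p(low), size=size,
                                          mode="bilinear", align_corners=False))
            return self.fuse(torch.cat(maps, dim=1))

    if __name__ == "__main__":
        dpff = DualPerspectiveFusion(channels=64)
        msff = MultiScaleFusion(high_ch=256, low_chs=[64, 128], out_ch=64)
        f_orig = torch.randn(1, 64, 128, 128)   # features of the original image
        f_rm = torch.randn(1, 64, 128, 128)     # features of the thin-cloud-removed image
        fused = dpff(f_orig, f_rm)              # -> (1, 64, 128, 128)
        high = torch.randn(1, 256, 32, 32)
        lows = [torch.randn(1, 64, 128, 128), torch.randn(1, 128, 64, 64)]
        out = msff(high, lows)                  # -> (1, 64, 128, 128)
        print(fused.shape, out.shape)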

Keywords