Remote Sensing (Jan 2023)

D<sup>3</sup>CNNs: Dual Denoiser Driven Convolutional Neural Networks for Mixed Noise Removal in Remotely Sensed Images

  • Zhenghua Huang,
  • Zifan Zhu,
  • Zhicheng Wang,
  • Xi Li,
  • Biyun Xu,
  • Yaozong Zhang,
  • Hao Fang

DOI
https://doi.org/10.3390/rs15020443
Journal volume & issue
Vol. 15, no. 2
p. 443

Abstract

Mixed (random and stripe) noise causes serious degradation of optical remotely sensed image quality, making the image content hard to analyze. To remove such noise, various inverse problems are usually constructed with different priors and solved by either model-based optimization methods or discriminative learning methods. However, each has its own drawbacks: the former are flexible but time-consuming when pursuing good performance, while the latter are fast but limited to specialized tasks, which restricts their broader application. To quickly obtain pleasing results by combining their merits, this paper proposes a novel denoising strategy, namely, Dual Denoiser Driven Convolutional Neural Networks (D3CNNs), to remove both random and stripe noise. D3CNNs comprises two key parts. First, two auxiliary variables, one for the denoised image and one for the stripe noise, are introduced to reformulate the inverse problem as a constrained optimization problem, which is solved iteratively with the alternating direction method of multipliers (ADMM). Second, a U-shape network handles the denoised-image auxiliary variable, while a residual CNN (RCNN) handles the stripe auxiliary variable. Subjective and objective comparisons in experiments on both synthetic and real-world remotely sensed images verify that the proposed method is effective and even outperforms the state-of-the-art.
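
The sketch below illustrates the ADMM-style alternation described in the abstract, assuming the common observation model y = x + s + n (clean image x, stripe noise s, random noise n). The functions `unet_denoise` and `rcnn_stripe_estimate` are hypothetical placeholders standing in for the trained U-shape network and residual CNN from the paper; simple filters are substituted so the script runs on its own and should not be read as the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def unet_denoise(v):
    # Hypothetical stand-in for the U-shape network prior on the image
    # variable; a 3x3 mean filter keeps the sketch self-contained.
    return uniform_filter(v, size=3)

def rcnn_stripe_estimate(v):
    # Hypothetical stand-in for the residual CNN prior on the stripe
    # variable; column means mimic column-wise (stripe) structure.
    return np.tile(v.mean(axis=0, keepdims=True), (v.shape[0], 1))

def dual_denoiser_admm(y, rho=1.0, iters=20):
    """ADMM alternation with two auxiliary variables: z1 (image), z2 (stripes)."""
    x = y.copy(); s = np.zeros_like(y)
    z1 = x.copy(); z2 = s.copy()
    u1 = np.zeros_like(y); u2 = np.zeros_like(y)
    for _ in range(iters):
        # Quadratic data-fidelity sub-problems (closed-form updates).
        x = (y - s + rho * (z1 - u1)) / (1.0 + rho)
        s = (y - x + rho * (z2 - u2)) / (1.0 + rho)
        # Denoiser-driven sub-problems: each auxiliary variable is refreshed
        # by its own (here, surrogate) denoiser.
        z1 = unet_denoise(x + u1)
        z2 = rcnn_stripe_estimate(s + u2)
        # Scaled dual ascent on the constraints z1 = x and z2 = s.
        u1 += x - z1
        u2 += s - z2
    return z1, z2  # denoised image estimate and stripe estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.ones((64, 64))
    stripes = 0.3 * np.sin(0.8 * np.arange(64))[None, :] * np.ones((64, 1))
    noisy = clean + stripes + 0.1 * rng.standard_normal((64, 64))
    restored, stripe_est = dual_denoiser_admm(noisy)
    print("RMSE after restoration:", np.sqrt(np.mean((restored - clean) ** 2)))
```

Swapping the surrogate filters for trained networks recovers the plug-and-play structure the abstract describes: the data-fidelity steps stay in closed form, and all learned behavior lives in the two denoiser calls.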

Keywords