Remote Sensing (Mar 2022)

Deep Internal Learning for Inpainting of Cloud-Affected Regions in Satellite Imagery

  • Mikolaj Czerkawski,
  • Priti Upadhyay,
  • Christopher Davison,
  • Astrid Werkmeister,
  • Javier Cardona,
  • Robert Atkinson,
  • Craig Michie,
  • Ivan Andonovic,
  • Malcolm Macdonald,
  • Christos Tachtatzis

DOI
https://doi.org/10.3390/rs14061342
Journal volume & issue
Vol. 14, no. 6
p. 1342

Abstract

Cloud cover remains a significant limitation to a broad range of applications relying on optical remote sensing imagery, including crop identification/yield prediction, climate monitoring, and land cover classification. A common approach to cloud removal treats the problem as an inpainting task and imputes optical data in the cloud-affected regions, either by mosaicing historical data or by exploiting sensing modalities not impacted by cloud obstructions, such as SAR. Recently, deep learning approaches have been explored in these applications; however, the majority of reported solutions rely on external learning practices, i.e., models trained on fixed datasets. Although these models perform well within the context of a particular dataset, a significant risk of spatial and temporal overfitting exists when they are applied in different locations or at different times. Here, cloud removal was implemented within an internal learning regime through an inpainting technique based on the deep image prior. The approach was evaluated both on a synthetic dataset with exact ground truth and on real samples. The ability to inpaint the cloud-affected regions for varying weather conditions across a whole year with no prior training was demonstrated, and the performance of the approach was characterised.
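The core idea of internal-learning inpainting described in the abstract is that a network is fitted from scratch to each individual scene, with a reconstruction loss evaluated only on cloud-free pixels, so the network's own structural bias fills in the masked region. The sketch below is a heavily simplified illustration of that masked-fitting principle, not the paper's method: it uses a small coordinate-based NumPy MLP with manual gradients in place of the convolutional deep image prior, and a synthetic smooth image in place of satellite data. All names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 16x16 "clear" scene: a smooth 2-D sinusoid standing in for optical data.
n = 16
ii, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
target = np.sin(ii / 4.0) * np.cos(jj / 4.0)

# Simulated cloud mask: True = observed (cloud-free), False = cloud-covered.
mask = np.ones((n, n), dtype=bool)
mask[5:9, 5:9] = False

# Pixel-coordinate inputs scaled to [-1, 1] (a coordinate-MLP stand-in for the
# fixed-noise CNN input used by the deep image prior).
coords = np.stack([ii.ravel() / (n - 1) * 2 - 1,
                   jj.ravel() / (n - 1) * 2 - 1], axis=1)   # (n*n, 2)
y = target.ravel()
obs = mask.ravel()

# One-hidden-layer network f(c) = tanh(c W1 + b1) W2 + b2, trained per scene.
h = 64
W1 = rng.normal(0.0, 1.0, (2, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.1, (h, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    a = np.tanh(coords @ W1 + b1)            # hidden activations, (N, h)
    pred = (a @ W2 + b2).ravel()             # predicted image, (N,)
    # Masked loss: only cloud-free pixels contribute to the fit.
    err = (pred - y) * obs
    m = obs.sum()
    # Manual backpropagation of the masked mean-squared error.
    dpred = 2.0 * err / m
    dW2 = a.T @ dpred[:, None]; db2 = dpred.sum(keepdims=True)
    dz = (dpred[:, None] @ W2.T) * (1.0 - a ** 2)
    dW1 = coords.T @ dz; db1 = dz.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# The trained network predicts everywhere, including the cloud-masked hole.
recon = (np.tanh(coords @ W1 + b1) @ W2 + b2).ravel().reshape(n, n)
hole_rmse = np.sqrt(np.mean((recon[~mask] - target[~mask]) ** 2))
obs_rmse = np.sqrt(np.mean((recon[mask] - target[mask]) ** 2))
print(f"RMSE on observed pixels: {obs_rmse:.3f}, in cloud-masked region: {hole_rmse:.3f}")
```

Because training sees only the unmasked pixels, any accuracy inside the hole comes entirely from the model's inductive bias toward smooth images, which is the same mechanism, in miniature, that lets a deep image prior inpaint a cloud-affected region without any external training set.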

Keywords