Remote Sensing (Nov 2022)

Remote Sensing Image Super-Resolution via Residual-Dense Hybrid Attention Network

  • Bo Yu,
  • Bin Lei,
  • Jiayi Guo,
  • Jiande Sun,
  • Shengtao Li,
  • Guangshuai Xie

DOI
https://doi.org/10.3390/rs14225780
Journal volume & issue
Vol. 14, no. 22
p. 5780

Abstract

Remote sensing datasets with long temporal coverage generally have limited spatial resolution, so most existing research uses single image super-resolution (SISR) methods to reconstruct high-resolution (HR) images. However, because of the limited information in low-resolution (LR) images and the ill-posed nature of SISR, it is difficult to reconstruct fine HR texture at large magnification factors (e.g., four times). To address this problem, we propose a new reference-based super-resolution method, the Residual-Dense Hybrid Attention Network (R-DHAN), which uses the rich texture information in a reference image to compensate for the deficiencies of the original LR image. The proposed SR model employs Super-Resolution by Neural Texture Transfer (SRNTT) as its backbone. On this structure, we propose a dense hybrid attention block (DHAB) as the building block of R-DHAN. The DHAB fuses the block's input with its internal features, making full use of the feature information while modeling the interdependence between channels and spatial dimensions to obtain strong representational ability. In addition, a hybrid channel-spatial attention mechanism is introduced to focus on important and useful regions and better reconstruct the final image. Experiments show that, compared with SRNTT and several classical SR techniques, the proposed R-DHAN performs well in both quantitative evaluation and visual quality.
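The general idea of a hybrid channel-spatial attention mechanism with a residual connection, as described in the abstract, can be illustrated with a minimal NumPy sketch. This is not the paper's DHAB implementation; the function names, weight shapes, and the simplified spatial gate (a plain channel-pooled sigmoid instead of a learned convolution) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W). Squeeze spatial dims, excite channels via a small MLP.
    squeezed = x.mean(axis=(1, 2))                          # (C,)
    weights = sigmoid(w2 @ np.maximum(w1 @ squeezed, 0.0))  # (C,)
    return x * weights[:, None, None]

def spatial_attention(x):
    # x: (C, H, W). Pool across channels to form a per-pixel gate.
    pooled = np.stack([x.mean(axis=0), x.max(axis=0)])      # (2, H, W)
    gate = sigmoid(pooled.mean(axis=0))                     # (H, W); learned conv omitted
    return x * gate[None, :, :]

def hybrid_attention_block(x, w1, w2):
    # Channel attention, then spatial attention, then a residual add,
    # so the block refines features without discarding its input.
    return x + spatial_attention(channel_attention(x, w1, w2))

# Hypothetical shapes: C channels, H x W feature map, channel reduction ratio r.
rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1   # reduction weights (illustrative)
w2 = rng.standard_normal((C, C // r)) * 0.1   # expansion weights (illustrative)
y = hybrid_attention_block(x, w1, w2)
print(y.shape)  # (8, 4, 4): attention rescales features but preserves shape
```

Because every stage only rescales the input and the residual path adds the input back, the block preserves the feature-map shape, which is what lets such blocks be stacked densely inside a larger network.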

Keywords