Remote Sensing (Jun 2024)

TDEGAN: A Texture-Detail-Enhanced Dense Generative Adversarial Network for Remote Sensing Image Super-Resolution

  • Mingqiang Guo,
  • Feng Xiong,
  • Baorui Zhao,
  • Ying Huang,
  • Zhong Xie,
  • Liang Wu,
  • Xueye Chen,
  • Jiaming Zhang

DOI: https://doi.org/10.3390/rs16132312
Journal volume & issue: Vol. 16, no. 13, p. 2312

Abstract

Image super-resolution (SR) technology can raise the resolution of remote sensing images, providing clearer and more reliable high-quality imagery for subsequent applications. However, when reconstructing high-frequency regions of remote sensing images, existing SR methods are prone to artifacts that degrade visual quality, and they struggle to generate realistic texture details. To address this issue, a texture-detail-enhanced dense generative adversarial network (TDEGAN) for remote sensing image SR is presented. The generator uses multi-level dense connections, residual connections, and Shuffle Attention (SA) to improve its feature extraction ability. A PatchGAN-style discriminator is designed to perform effective local discrimination and help the network generate rich, detailed features. To reduce the impact of artifacts, we introduce an artifact loss function that, combined with the exponential moving average (EMA) technique, uses local statistics to distinguish generated artifacts from actual texture details, helping the network suppress artifacts and produce more realistic textures. Experiments show that TDEGAN better restores the texture details of remote sensing images and achieves advantages in both quantitative evaluation metrics and visual quality.
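
The abstract does not give the exact formulation of the artifact loss. As a rough illustration only, the PyTorch-style sketch below shows one plausible way an EMA-stabilized generator and local residual statistics could be combined to separate artifacts from genuine texture; the function names, window size, decay rate, and weighting scheme are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F


def local_variance(x: torch.Tensor, window: int = 7) -> torch.Tensor:
    """Per-pixel local variance via average pooling: Var = E[x^2] - E[x]^2."""
    pad = window // 2
    mean = F.avg_pool2d(x, window, stride=1, padding=pad)
    mean_sq = F.avg_pool2d(x * x, window, stride=1, padding=pad)
    return (mean_sq - mean * mean).clamp(min=0.0)


def artifact_loss(sr: torch.Tensor, sr_ema: torch.Tensor, hr: torch.Tensor,
                  window: int = 7) -> torch.Tensor:
    """Hypothetical artifact loss (not the paper's exact definition): penalize
    pixels whose residual fluctuates much more than that of the EMA-stabilized
    output, i.e. regions more likely to be artifacts than real texture."""
    res = torch.abs(sr - hr)                    # residual of the current generator
    res_ema = torch.abs(sr_ema - hr).detach()   # residual of the EMA generator
    # Weight map: large where the current residual's local statistics exceed the EMA one's.
    weight = (local_variance(res, window) - local_variance(res_ema, window))
    weight = weight.clamp(min=0.0).detach()
    return (weight * res).mean()


@torch.no_grad()
def update_ema(ema_model: torch.nn.Module, model: torch.nn.Module,
               decay: float = 0.999) -> None:
    """Exponential moving average of generator parameters."""
    for p_ema, p in zip(ema_model.parameters(), model.parameters()):
        p_ema.mul_(decay).add_(p, alpha=1.0 - decay)
```

In such a setup, `update_ema` would be called after each generator step, and `artifact_loss` would be added to the adversarial and perceptual terms with some weighting; the actual combination used in TDEGAN may differ.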

Keywords