IET Image Processing (Jun 2021)

A plexus‐convolutional neural network framework for fast remote sensing image super‐resolution in wavelet domain

  • Farah Deeba,
  • Yuanchun Zhou,
  • Fayaz Ali Dharejo,
  • Muhammad Ashfaq Khan,
  • Bhagwan Das,
  • Xuezhi Wang,
  • Yi Du

DOI
https://doi.org/10.1049/ipr2.12136
Journal volume & issue
Vol. 15, no. 8
pp. 1679–1687

Abstract

Satellite image processing has been widely used in recent years in applications such as land classification, identification, resource exploration, and image super‐resolution. Because of orbital position, revisit time, viewing‐angle limitations, and weather effects, satellite images are challenging to acquire and manage. Resolution can be spatial, spectral, or temporal; here, we concentrate on spatial resolution and on super‐resolving images from low‐resolution inputs. For remote sensing image super‐resolution, we propose a novel fast wavelet‐based super‐resolution (FWSR) plexus framework that performs super‐resolution convolutional neural network (SRCNN)‐like feature extraction with three hidden layers. First, the wavelet sub‐band images, comprising the approximation and the individual frequency sub‐bands, are combined into a pre‐defined full‐scale training set. Second, to speed up image recovery, the mapping of each wavelet sub‐band image is estimated from its approximation image. Third, a sub‐pixel layer added at the end of the network reconstructs image quality within the plexus framework. The approximation sub‐band images obtained after discrete wavelet transform (DWT) decomposition are used as input rather than the original image because of their high‐frequency content and preserved characteristics. The proposed technique is compared with five current super‐resolution neural network approaches on three public satellite image datasets and two benchmark datasets, and the experimental results are evaluated both qualitatively and quantitatively.
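A minimal sketch of the pipeline the abstract describes, not the authors' released code: DWT decomposition of the low‐resolution input, an SRCNN‐like three‐layer network on the stacked sub‐bands, and a sub‐pixel (pixel‐shuffle) layer at the end. The layer widths, kernel sizes, and the Haar wavelet are illustrative assumptions.

    # Sketch (PyTorch + PyWavelets), assuming a grayscale input tile.
    import numpy as np
    import pywt
    import torch
    import torch.nn as nn

    class WaveletSRNet(nn.Module):
        def __init__(self, scale=2, in_channels=4, features=64):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(in_channels, features, kernel_size=9, padding=4),  # feature extraction
                nn.ReLU(inplace=True),
                nn.Conv2d(features, 32, kernel_size=1),                      # non-linear mapping
                nn.ReLU(inplace=True),
                nn.Conv2d(32, in_channels * scale ** 2, kernel_size=5, padding=2),  # reconstruction
            )
            self.upsample = nn.PixelShuffle(scale)  # sub-pixel layer at the end of the network

        def forward(self, x):
            return self.upsample(self.body(x))

    def dwt_subbands(img):
        # Stack the four DWT sub-bands (LL, LH, HL, HH) as the network's input channels.
        ll, (lh, hl, hh) = pywt.dwt2(img, 'haar')
        return torch.from_numpy(np.stack([ll, lh, hl, hh])).float().unsqueeze(0)

    if __name__ == "__main__":
        lr_image = np.random.rand(128, 128)        # stand-in for a low-resolution satellite tile
        x = dwt_subbands(lr_image)                 # shape: (1, 4, 64, 64)
        sr_subbands = WaveletSRNet(scale=2)(x)     # shape: (1, 4, 128, 128)
        print(sr_subbands.shape)

The sub‐pixel layer rearranges channels into spatial resolution, which is what lets the network upscale the wavelet sub‐bands without an explicit interpolation step.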

Keywords