Uro (Jun 2024)

Revolutionizing Prostate Whole-Slide Image Super-Resolution: A Comparative Journey from Regression to Generative Adversarial Networks

  • Anil B. Gavade,
  • Kartik A. Gadad,
  • Priyanka A. Gavade,
  • Rajendra B. Nerli,
  • Neel Kanwal

DOI: https://doi.org/10.3390/uro4030007
Journal volume & issue: Vol. 4, no. 3, pp. 89–103

Abstract

Microscopic and digital whole-slide images (WSIs) often suffer from limited spatial resolution, hindering accurate pathological analysis and cancer diagnosis. Improving the spatial resolution of these pathology images is crucial, as it can enhance the visualization of fine cellular and tissue structures, leading to more reliable and precise cancer detection and diagnosis. This paper presents a comprehensive comparative study on super-resolution (SR) reconstruction techniques for prostate WSIs, exploring a range of machine learning, deep learning, and generative adversarial network (GAN) algorithms. The algorithms investigated include regression, sparse learning, principal component analysis, bicubic interpolation, multi-support vector neural networks, an SR convolutional neural network, and an autoencoder, along with advanced SRGAN-based methods. The performance of these algorithms was evaluated using a suite of metrics: the peak signal-to-noise ratio (PSNR), the structural similarity index measure (SSIM), the root-mean-squared error (RMSE), the mean absolute error (MAE), and the mean structural similarity index (MSSIM). The study was conducted on the SICAPv2 prostate WSI dataset. The results demonstrated that the SRGAN algorithm outperformed the other algorithms, achieving the highest PSNR of 26.47 dB, an SSIM of 0.85, and an MSSIM of 0.92 at 4× magnification of the input low-resolution (LR) image, while preserving image quality and fine details. The application of SRGAN therefore offers a cost-effective alternative to the expensive acquisition of high-resolution pathology images, enhancing cancer diagnosis accuracy.
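
The evaluation protocol summarized above can be illustrated with a minimal sketch: simulate a 4× low-resolution input from a high-resolution patch, restore it with the bicubic baseline mentioned in the abstract, and score the result with PSNR and SSIM. This is not the authors' code; it assumes scikit-image is available and uses a built-in sample image in place of a SICAPv2 WSI patch.

```python
import numpy as np
from skimage.data import astronaut  # stand-in image; the paper uses SICAPv2 WSI patches
from skimage.transform import resize
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Ground-truth high-resolution patch, scaled to [0, 1].
hr = astronaut().astype(np.float64) / 255.0

# Simulate the low-resolution input by 4x downsampling, then restore it
# with bicubic interpolation (order=3), the classical baseline in the study.
h, w, _ = hr.shape
lr = resize(hr, (h // 4, w // 4), order=3, anti_aliasing=True)
sr_bicubic = resize(lr, (h, w), order=3, anti_aliasing=False)

# Score the reconstruction with two of the paper's metrics.
psnr = peak_signal_noise_ratio(hr, sr_bicubic, data_range=1.0)
ssim = structural_similarity(hr, sr_bicubic, channel_axis=-1, data_range=1.0)
print(f"Bicubic 4x baseline -> PSNR: {psnr:.2f} dB, SSIM: {ssim:.3f}")
```

A learned SR model (e.g., SRCNN or SRGAN) would replace the second `resize` call with a forward pass of the network; the metric computation stays the same.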

Keywords