PLoS ONE (Jan 2023)

Predicting of diabetic retinopathy development stages of fundus images using deep learning based on combined features.

  • Ahlam Shamsan,
  • Ebrahim Mohammed Senan,
  • Hamzeh Salameh Ahmad Shatnawi

DOI
https://doi.org/10.1371/journal.pone.0289555
Journal volume & issue
Vol. 18, no. 10
p. e0289555

Abstract

The number of diabetic retinopathy (DR) patients increases every year, posing a public health problem. Regular screening of diabetic patients is therefore necessary to prevent DR from progressing to advanced stages that lead to blindness. Manual diagnosis requires effort and expertise, is prone to error, and often yields differing expert opinions. Artificial intelligence techniques therefore help doctors reach a proper diagnosis and resolve differing opinions. This study developed three approaches, each with two systems, for early diagnosis of DR disease progression. All colour fundus images were subjected to image enhancement, and the contrast of the region of interest (ROI) was increased through filters. All features extracted by the DenseNet-121 and AlexNet models (Dense-121 and Alex) were fed to the Principal Component Analysis (PCA) method to select important features and reduce their dimensionality. The first approach analyses DR images for early prediction of DR disease progression using an Artificial Neural Network (ANN) with the selected, low-dimensional features of the Dense-121 and Alex models. The second approach analyses DR images for early prediction of DR disease progression by integrating the important, low-dimensional features of the Dense-121 and Alex models before and after PCA. The third approach analyses DR images for early prediction of DR disease progression using an ANN with the radiomic features. The radiomic features combine the features of each CNN model (Dense-121 and Alex) separately with the handcrafted features extracted by the Discrete Wavelet Transform (DWT), Local Binary Pattern (LBP), Fuzzy Colour Histogram (FCH), and Gray Level Co-occurrence Matrix (GLCM) methods. With the radiomic features of the Alex model and the handcrafted features, the ANN reached a sensitivity of 97.92%, an AUC of 99.56%, an accuracy of 99.1%, a specificity of 99.4%, and a precision of 99.06%.
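
As an illustration of the pipeline outlined above (CNN feature extraction, PCA dimensionality reduction, ANN classification), the following is a minimal sketch assuming torchvision's pretrained DenseNet-121 and AlexNet as feature extractors and scikit-learn's PCA and MLPClassifier as the dimensionality-reduction and ANN stages. The image size, PCA dimensionality, and MLP layout are illustrative assumptions, not the authors' exact configuration; for the radiomic approach, the handcrafted DWT/LBP/FCH/GLCM features would be concatenated to the CNN features before the ANN in the same way.

```python
# Hypothetical sketch: CNN feature extraction -> PCA reduction -> ANN classification.
# Backbones, image size, PCA dimensionality, and MLP layout are illustrative assumptions.
import numpy as np
import torch
from torch import nn
import torchvision.models as models
import torchvision.transforms as T
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained backbones used purely as fixed feature extractors (classifier heads removed).
densenet = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT).to(device).eval()
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).to(device).eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def densenet_features(batch):
    # DenseNet-121: global-average-pool the last feature map -> 1024-D vector per image.
    fmap = densenet.features(batch)
    return torch.flatten(nn.functional.adaptive_avg_pool2d(fmap, 1), 1)

@torch.no_grad()
def alexnet_features(batch):
    # AlexNet: output of the penultimate fully connected layer -> 4096-D vector per image.
    x = torch.flatten(alexnet.avgpool(alexnet.features(batch)), 1)
    return alexnet.classifier[:-1](x)

def extract_features(images):
    """images: list of PIL fundus images (already enhanced / ROI-contrast-adjusted)."""
    batch = torch.stack([preprocess(im) for im in images]).to(device)
    return densenet_features(batch).cpu().numpy(), alexnet_features(batch).cpu().numpy()

def fit_pca_ann(cnn_feats, labels, n_components=200):
    # Reduce the deep features with PCA, then train an ANN (MLP) on the reduced features.
    pca = PCA(n_components=min(n_components, *cnn_feats.shape))
    reduced = pca.fit_transform(cnn_feats)
    ann = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500)
    ann.fit(reduced, labels)
    return pca, ann
```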