PLoS ONE (Aug 2019)

Accuracy of computer-assisted vertical cup-to-disk ratio grading for glaucoma screening.

  • Blake M Snyder,
  • Sang Min Nam,
  • Preeyanuch Khunsongkiet,
  • Sakarin Ausayakhun,
  • Thidarat Leeungurasatien,
  • Maxwell R Leiter,
  • Artem Sevastopolsky,
  • Ashlin S Joye,
  • Elyse J Berlinberg,
  • Yingna Liu,
  • David A Ramirez,
  • Caitlin A Moe,
  • Somsanguan Ausayakhun,
  • Robert L Stamper,
  • Jeremy D Keenan

DOI: https://doi.org/10.1371/journal.pone.0220362
Journal volume & issue: Vol. 14, no. 8, p. e0220362

Abstract

Purpose

Glaucoma screening can be performed by assessing the vertical cup-to-disk ratio (VCDR) of the optic nerve head from fundus photography, but VCDR grading is inherently subjective. This study investigated whether computer software could improve the accuracy and repeatability of VCDR assessment.

Methods

In this cross-sectional diagnostic accuracy study, 5 ophthalmologists independently assessed the VCDR from a set of 200 optic disk images, with the median grade used as the reference standard for subsequent analyses. Eight non-ophthalmologists graded each image by two different methods: by visual inspection and with assistance from a custom-made, publicly available software program. Agreement with the reference standard grade was assessed for each method by calculating the intraclass correlation coefficient (ICC), and sensitivity and specificity were determined relative to a median ophthalmologist grade of ≥0.7.

Results

VCDR grades ranged from 0.1 to 0.9 for visual assessment and from 0.1 to 1.0 for software-assisted grading, with a median grade of 0.4 for each. Agreement between each of the 8 graders and the reference standard was higher for visual inspection (median ICC 0.65, interquartile range [IQR] 0.57 to 0.82) than for software-assisted grading (median ICC 0.59, IQR 0.44 to 0.71; P = 0.02, Wilcoxon signed-rank test). Visual inspection and software assistance had similar sensitivity and specificity for detecting glaucomatous cupping.

Conclusion

The computer software used in this study did not improve the reproducibility or validity of VCDR grading from fundus photographs compared with simple visual inspection. More clinical experience was correlated with higher agreement with the ophthalmologist VCDR reference standard.
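The study's analysis code is not reproduced here, but the computations the abstract describes (a median reference standard, per-grader agreement via ICC, sensitivity/specificity at the ≥0.7 cutoff, and a paired Wilcoxon signed-rank comparison across graders) can be sketched in a few lines. The Python example below is a minimal illustration on simulated placeholder data, not the authors' code or data; the ICC(2,1) variant, noise levels, and all variable names are assumptions made for the sketch.

```python
"""Illustrative sketch of the abstract's analyses, using simulated data only."""
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Hypothetical data: 5 ophthalmologists x 200 images, plus 8 non-ophthalmologist
# graders who each graded every image by two methods (values are simulated).
truth = rng.uniform(0.1, 0.9, size=200)                    # latent VCDR per image
ophth = truth + rng.normal(0, 0.05, size=(5, 200))         # ophthalmologist grades
visual = truth + rng.normal(0, 0.10, size=(8, 200))        # visual-inspection grades
software = truth + rng.normal(0, 0.12, size=(8, 200))      # software-assisted grades

reference = np.median(ophth, axis=0)   # median ophthalmologist grade = reference standard
glaucomatous = reference >= 0.7        # reference definition of glaucomatous cupping

def icc2_1(y):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    y is an (n targets x k raters) matrix."""
    n, k = y.shape
    grand = y.mean()
    ss_rows = k * ((y.mean(axis=1) - grand) ** 2).sum()    # between-image variation
    ss_cols = n * ((y.mean(axis=0) - grand) ** 2).sum()    # between-rater variation
    ss_err = ((y - grand) ** 2).sum() - ss_rows - ss_cols  # residual variation
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def sens_spec(grades, cutoff=0.7):
    """Sensitivity/specificity of one grader's VCDRs against the reference."""
    pos = grades >= cutoff
    return pos[glaucomatous].mean(), (~pos)[~glaucomatous].mean()

# Per-grader agreement with the reference standard: one ICC per grader/method.
icc_visual = np.array([icc2_1(np.column_stack([g, reference])) for g in visual])
icc_software = np.array([icc2_1(np.column_stack([g, reference])) for g in software])

# Paired comparison of the 8 graders' ICCs, as reported in the abstract.
stat, p = wilcoxon(icc_visual, icc_software)
sens, spec = sens_spec(visual[0])   # e.g., first grader, visual method
print(f"median ICC visual={np.median(icc_visual):.2f}, "
      f"software={np.median(icc_software):.2f}, Wilcoxon P={p:.3g}")
print(f"grader 1 (visual): sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Note that the choice of ICC(2,1) computed pairwise between each grader and the reference is one plausible reading of the abstract's "agreement with the reference standard"; the published paper should be consulted for the exact ICC model used.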