EBioMedicine (Sep 2018)

Automated retinopathy of prematurity screening using deep neural networks

  • Jianyong Wang,
  • Rong Ju,
  • Yuanyuan Chen,
  • Lei Zhang,
  • Junjie Hu,
  • Yu Wu,
  • Wentao Dong,
  • Jie Zhong,
  • Zhang Yi

Journal volume & issue
Vol. 35
pp. 361–368

Abstract


Background: Retinopathy of prematurity (ROP) is the leading cause of childhood blindness worldwide. An automated ROP detection system is urgently needed and could serve as a safe, reliable, and cost-effective complement to human experts.

Methods: An automated ROP detection system called DeepROP was developed using deep neural networks (DNNs). ROP detection was divided into an identification task and a grading task, and two task-specific DNN models, Id-Net and Gr-Net, were designed for them, respectively. To develop the DNNs, large-scale datasets of retinal fundus images were constructed by having clinical ophthalmologists label images collected during ROP screenings.

Findings: On the test dataset, Id-Net achieved a sensitivity of 96.62% (95% CI, 92.29%–98.89%) and a specificity of 99.32% (95% CI, 96.29%–99.98%) for ROP identification, while Gr-Net attained a sensitivity of 88.46% (95% CI, 96.29%–99.98%) and a specificity of 92.31% (95% CI, 81.46%–97.86%) on the ROP grading task. On another 552 cases, the developed DNNs outperformed some human experts. In a clinical setting, the sensitivity and specificity of DeepROP for ROP identification were 84.91% (95% CI, 76.65%–91.12%) and 96.90% (95% CI, 95.49%–97.96%), respectively, whereas the corresponding measures for ROP grading were 93.33% (95% CI, 68.05%–99.83%) and 73.63% (95% CI, 68.05%–99.83%), respectively.

Interpretation: We constructed large-scale ROP datasets with adequate clinical labels and proposed novel DNN models that learn ROP features directly from big data. The developed DeepROP has the potential to be an efficient and effective system for automated ROP screening.

Funding: National Natural Science Foundation of China under Grants 61432012 and U1435213.
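The two-stage flow described in the Methods (Id-Net flags ROP-positive images, and Gr-Net grades only the flagged cases) can be illustrated with a minimal sketch. The backbone network, input size, grading labels, and decision thresholds below are assumptions made for illustration; the abstract does not specify the actual Id-Net and Gr-Net architectures, and this is not the authors' code.

```python
# Hypothetical sketch of a two-stage ROP screening pipeline in PyTorch.
# Id-Net: ROP vs. normal; Gr-Net: applied only to ROP-positive images.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),  # assumed input resolution
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def build_binary_classifier() -> nn.Module:
    # Stand-in backbone; the paper's Id-Net/Gr-Net designs are not given here.
    net = models.resnet18(weights=None)
    net.fc = nn.Linear(net.fc.in_features, 2)  # two output classes per task
    return net

id_net = build_binary_classifier()  # identification: ROP vs. normal
gr_net = build_binary_classifier()  # grading: mild vs. severe (assumed scheme)

@torch.no_grad()
def screen_fundus_image(path: str) -> str:
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    id_net.eval()
    gr_net.eval()
    # Stage 1: identification; 0.5 is an assumed decision threshold.
    if id_net(x).softmax(dim=1)[0, 1] < 0.5:
        return "normal"
    # Stage 2: grading of ROP-positive cases only.
    severe_prob = gr_net(x).softmax(dim=1)[0, 1]
    return "severe ROP" if severe_prob >= 0.5 else "mild ROP"
```

In this sketch the two models are trained and evaluated separately, which mirrors the abstract's separate sensitivity/specificity figures for the identification and grading tasks.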