Scientific Reports (Apr 2023)

An interpretable and interactive deep learning algorithm for a clinically applicable retinal fundus diagnosis system by modelling finding-disease relationship

  • Jaemin Son,
  • Joo Young Shin,
  • Seo Taek Kong,
  • Jeonghyuk Park,
  • Gitaek Kwon,
  • Hoon Dong Kim,
  • Kyu Hyung Park,
  • Kyu-Hwan Jung,
  • Sang Jun Park

DOI
https://doi.org/10.1038/s41598-023-32518-3
Journal volume & issue
Vol. 13, no. 1
pp. 1 – 13

Abstract

The identification of abnormal findings manifested in retinal fundus images and the diagnosis of ophthalmic diseases are essential to the management of potentially vision-threatening eye conditions. Recently, deep learning-based computer-aided diagnosis (CAD) systems have demonstrated their potential to reduce reading time and discrepancy amongst readers. However, the obscure reasoning of deep neural networks (DNNs) has been a leading cause of reluctance towards their clinical use as CAD systems. Here, we present a novel architectural and algorithmic design of DNNs to comprehensively identify 15 abnormal retinal findings and diagnose 8 major ophthalmic diseases from macula-centered fundus images with accuracy comparable to that of experts. We then define a notion of counterfactual attribution ratio (CAR) which illuminates the system’s diagnostic reasoning, representing how each abnormal finding contributed to its diagnostic prediction. Using CAR, we show that both quantitative and qualitative interpretation and interactive adjustment of the CAD result can be achieved. A comparison of the model’s CAR with experts’ finding-disease diagnosis correlation confirms that the proposed model identifies the relationships between findings and diseases as ophthalmologists do.
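To make the counterfactual idea concrete, the following is a minimal sketch, assuming CAR is computed by comparing the model's predicted disease probability when a finding is present against the prediction when that finding is counterfactually suppressed. The function name and the example probabilities are hypothetical; the paper gives the precise definition.

```python
def counterfactual_attribution_ratio(p_factual: float, p_counterfactual: float) -> float:
    """Hypothetical CAR sketch: ratio of the disease probability predicted
    with a finding present (p_factual) to the probability predicted after
    counterfactually removing that finding (p_counterfactual).

    A ratio well above 1 suggests the finding strongly drove the
    diagnostic prediction; a ratio near 1 suggests it contributed little.
    """
    if p_counterfactual <= 0.0:
        raise ValueError("counterfactual probability must be positive")
    return p_factual / p_counterfactual


# Illustrative values only: suppressing a haemorrhage finding drops the
# predicted probability of diabetic retinopathy from 0.9 to 0.3.
ratio = counterfactual_attribution_ratio(0.9, 0.3)
```

In this toy example the ratio of 3 would indicate that the haemorrhage finding was a major contributor to the diabetic retinopathy prediction, which is the kind of finding-disease attribution the abstract describes.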