Scientific Reports (Jun 2023)

Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation

  • Daniel Schaudt,
  • Reinhold von Schwerin,
  • Alexander Hafner,
  • Pascal Riedel,
  • Christian Späte,
  • Manfred Reichert,
  • Andreas Hinteregger,
  • Meinrad Beer,
  • Christopher Kloth

DOI
https://doi.org/10.1038/s41598-023-36148-7
Journal volume & issue
Vol. 13, no. 1
pp. 1–13

Abstract

In medical imaging, deep learning models can be a critical tool to shorten time-to-diagnosis and support specialized medical staff in clinical decision making. Successfully training deep learning models usually requires large amounts of high-quality data, which are often unavailable in medical imaging tasks. In this work we train a deep learning model on university hospital chest X-ray data comprising 1082 images. The data was reviewed, differentiated into four causes of pneumonia, and annotated by an expert radiologist. To successfully train a model on this small amount of complex image data, we propose a special knowledge distillation process, which we call Human Knowledge Distillation. This process enables deep learning models to utilize annotated regions in the images during training. This form of guidance by a human expert improves model convergence and performance. We evaluate the proposed process on our study data for multiple types of models, all of which show improved results. The best model of this study, called PneuKnowNet, improves overall accuracy by +2.3 percentage points over a baseline model and also yields more meaningful decision regions. Exploiting this implicit data quality-quantity trade-off can be a promising approach for many scarce-data domains beyond medical imaging.
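The abstract does not spell out how the annotated regions enter the loss, so the following is only an illustrative sketch of one common way to implement such expert guidance: add a penalty that discourages the model's saliency (e.g. a Grad-CAM-style attention map) from falling outside the radiologist's annotation. The function names (`guidance_loss`, `combined_loss`) and the weighting `lam` are hypothetical, not taken from the paper.

```python
import numpy as np

def guidance_loss(attention: np.ndarray, mask: np.ndarray) -> float:
    """Fraction of the model's attention mass that falls OUTSIDE the
    expert-annotated region.

    attention: 2-D array of non-negative saliency values (e.g. Grad-CAM).
    mask:      binary 2-D array, 1 inside the radiologist's annotation.
    Returns 0.0 when all attention lies inside the annotation,
    1.0 when all of it lies outside.
    """
    total = attention.sum()
    if total == 0:
        return 0.0  # no attention at all: nothing to penalize
    outside = (attention * (1 - mask)).sum()
    return float(outside / total)

def combined_loss(ce_loss: float, attention: np.ndarray,
                  mask: np.ndarray, lam: float = 0.5) -> float:
    """Hypothetical total objective: standard classification loss plus
    the attention-guidance penalty, weighted by lam."""
    return ce_loss + lam * guidance_loss(attention, mask)

# Toy example: annotation covers the top-left quadrant of a 4x4 image.
mask = np.zeros((4, 4))
mask[:2, :2] = 1

att_inside = np.zeros((4, 4))
att_inside[0, 0] = 1.0   # attention inside the annotation -> no penalty

att_outside = np.zeros((4, 4))
att_outside[3, 3] = 1.0  # attention outside the annotation -> full penalty
```

Under this sketch, a model attending to the annotated lung region incurs no extra loss, while attention on irrelevant regions is penalized, nudging training toward the expert's decision regions.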