Discover Computing (Jun 2025)
Enhancing medical diagnosis on chest X-rays: knowledge distillation from self-supervised based model to compressed student model
Abstract
Advances in deep learning and self-supervised learning have made it possible to diagnose medical images more accurately. Our goal in this work is to increase the accuracy of medical diagnosis on chest X-rays by utilising knowledge distillation and model compression techniques. By distilling knowledge from a self-supervised teacher model (SwAV with a ResNet-50 backbone), we improve inference speed and reduce the computational resources needed for accurate diagnosis. Our strategy entails transferring the knowledge acquired by self-supervised models to smaller, more efficient models without sacrificing accuracy. In addition, we examine how model compression and distillation affect the interpretability of the diagnosis. The results of this study may enhance medical diagnosis procedures and increase their accessibility in resource-constrained environments. Our extensive experiments demonstrate the efficacy of this approach: the student model trained with knowledge distillation reaches 97.34% accuracy.
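As a minimal illustration of the knowledge-distillation objective referred to in the abstract, the sketch below implements the standard softened-softmax distillation loss (Hinton et al.'s formulation). The temperature value and the example logits are hypothetical and not taken from this paper:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients keep a comparable
    # magnitude across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Hypothetical logits for a 3-class example (not from the paper):
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
loss = distillation_loss(teacher, student)
```

In practice this soft-target term is combined with the ordinary cross-entropy on ground-truth labels; the loss is zero only when the student's softened distribution matches the teacher's exactly.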
Keywords