IEEE Access (Jan 2024)

KDALDL: Knowledge Distillation-Based Adaptive Label Distribution Learning Network for Bone Age Assessment

  • Hao-Dong Zheng,
  • Lei Yu,
  • Yu-Ting Lu,
  • Wei-Hao Zhang,
  • Yan-Jun Yu

DOI
https://doi.org/10.1109/ACCESS.2024.3358821
Journal volume & issue
Vol. 12
pp. 17679–17689

Abstract


Deep learning-based bone age assessment (BAA) approaches have certain drawbacks, such as ignoring the correlation among age labels and simply assuming that bone development is linearly related to bone age, which can affect prediction accuracy. To solve these problems, a knowledge distillation-based adaptive label distribution learning method called KDALDL is proposed. The KDALDL framework comprises a teacher model and a student model, both consisting of modules for multi-scale feature extraction, feature refinement, and label distribution learning. First, a multi-scale feature extraction module is designed based on the Swin Transformer to extract feature information at various scales. Subsequently, these features are fed into the feature refinement module to capture the most informative image features. Then, the discrete label distributions obtained from the age labels via a Gaussian function are used to train the teacher model. Finally, the teacher model's outputs are used to train the student model through knowledge distillation, which enables the student model to achieve improved results by learning from the teacher. The proposed method is validated on the Radiological Society of North America (RSNA) dataset, where it achieves outstanding results.
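The two core ideas in the abstract, converting a scalar age label into a Gaussian label distribution and training the student against the teacher's softened outputs, can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names, the sigma and temperature values, and the choice of 0–228 monthly age bins are all illustrative assumptions.

```python
import numpy as np

def gaussian_label_distribution(age, ages, sigma=2.0):
    """Turn a scalar age label into a discrete distribution over age bins
    using a Gaussian centered at the true age (sigma is an assumed value)."""
    d = np.exp(-((ages - age) ** 2) / (2.0 * sigma ** 2))
    return d / d.sum()

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """Standard knowledge-distillation objective: KL(teacher || student)
    on temperature-softened outputs, scaled by T^2 (T=4 is an assumption)."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)))

# Hypothetical bins: bone age in months, 0-228 (the RSNA labels are in months).
ages = np.arange(229, dtype=float)
target = gaussian_label_distribution(120.0, ages)  # peaks at 120 months
```

In this setup the teacher is first fit to the Gaussian targets, after which `distillation_loss` transfers its softened predictions to the student; the loss is zero when the two models agree exactly.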

Keywords