Xi'an Gongcheng Daxue Xuebao (Journal of Xi'an Polytechnic University), Aug 2023

A lightweight image classification method based on dual-source adaptive knowledge distillation

  • ZHANG Kaibing,
  • MA Dongtong,
  • MENG Yalei

DOI
https://doi.org/10.13338/j.issn.1674-649x.2023.04.011
Journal volume & issue
Vol. 37, no. 4
pp. 82–91

Abstract


A dual-source adaptive knowledge distillation (DSAKD) method is proposed to address two issues in knowledge distillation: the loss of feature information during feature alignment, and the failure of soft-label distillation to account for differences among training samples. DSAKD extracts more discriminative knowledge from both the feature layers and the soft labels of the teacher network, enhancing the performance of a lightweight student network. First, an attention-based feature adaptive fusion module is proposed to integrate the intermediate-layer features of the teacher and student networks, after which a feature-embedding contrastive distillation strategy is used to optimize the student network's features. Second, an adaptive temperature distillation strategy is proposed that adaptively assigns a temperature coefficient to each training sample according to the prediction confidence of the teacher network. Experimental results show that the proposed method achieves the best distillation performance on three benchmark datasets, significantly improving the classification accuracy of lightweight student networks: compared with the strongest baseline, it improves average top-1 validation accuracy on CIFAR10, CIFAR100, and ImageNet by 0.46%, 0.41%, and 0.59%, respectively.
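The abstract specifies only that each sample receives its own temperature based on the teacher's prediction confidence. The following PyTorch code is a minimal sketch of that idea, not the paper's implementation: the function name adaptive_temperature_kd_loss, the t_min/t_max bounds, and the linear confidence-to-temperature mapping are all assumptions made for illustration.

    import torch
    import torch.nn.functional as F

    def adaptive_temperature_kd_loss(student_logits, teacher_logits,
                                     t_min=1.0, t_max=4.0):
        # Per-sample teacher confidence: max softmax probability.
        with torch.no_grad():
            conf = F.softmax(teacher_logits, dim=1).max(dim=1).values   # shape (B,)
            # Map confidence in [0, 1] to a per-sample temperature
            # (linear mapping is an assumption; the paper's rule is not given).
            tau = (t_min + (t_max - t_min) * conf).unsqueeze(1)         # shape (B, 1)
        log_p_s = F.log_softmax(student_logits / tau, dim=1)
        p_t = F.softmax(teacher_logits / tau, dim=1)
        # Per-sample KL divergence between softened distributions,
        # scaled by tau^2 as in standard knowledge distillation.
        kl = F.kl_div(log_p_s, p_t, reduction='none').sum(dim=1)
        return (tau.squeeze(1) ** 2 * kl).mean()

    # Example usage with random logits (batch of 8, 100 classes):
    s_logits = torch.randn(8, 100)
    t_logits = torch.randn(8, 100)
    loss = adaptive_temperature_kd_loss(s_logits, t_logits)

Note that the direction of the mapping is itself a design choice: the sketch softens high-confidence teacher predictions more strongly, but the inverse assignment is equally plausible given only the abstract.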

Keywords