IEEE Access (Jan 2024)

Heterogeneous Knowledge Distillation for Anomaly Detection

  • Longjiang Wu,
  • Jiali Zhou

DOI
https://doi.org/10.1109/ACCESS.2024.3415503
Journal volume & issue
Vol. 12
pp. 161490 – 161499

Abstract


Anomaly detection remains a formidable challenge due to the inherent rarity and unpredictability of anomalous instances. Traditional knowledge distillation approaches, which compare output features from student-teacher (S-T) networks, often suffer from high false positive rates caused by overgeneralization. To address these limitations, our study introduces a novel approach, dubbed HKD-AD, which builds heterogeneity into both the data and the structural composition of the S-T networks. Specifically, we feed paired normal and anomalous images to the S-T networks, training the student network to mimic the teacher's normal features from the simulated anomalous images. We also employ a Transformer-based teacher to guide the CNN-based student network, which not only amplifies feature extraction capabilities but also substantially enhances anomaly localization. Experiments conducted on the challenging MVTec AD dataset demonstrate that our HKD-AD significantly outperforms existing methods, achieving state-of-the-art results with high efficiency.
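
The sketch below illustrates the distillation setup described in the abstract in its most generic form; it is not the authors' implementation. A Transformer-style teacher encodes the normal image, a CNN student encodes a simulated-anomalous view of the same image, and the student is trained to reproduce the teacher's "normal" features. The module sizes, the noise-based anomaly simulation, and the per-location cosine-distance loss are all illustrative assumptions, as are the names TinyViTTeacher, CNNStudent, and distill_loss.

```python
# Minimal, self-contained sketch of heterogeneous S-T feature distillation
# (assumed formulation; details such as layer counts and the anomaly simulation
# are placeholders, not taken from the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyViTTeacher(nn.Module):
    """Toy Transformer encoder standing in for a pretrained ViT-style teacher."""
    def __init__(self, img_size=224, patch=16, dim=256, depth=4, heads=4):
        super().__init__()
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.grid = img_size // patch                        # 14x14 tokens for 224/16
        self.pos = nn.Parameter(torch.zeros(1, self.grid * self.grid, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)   # (B, N, dim)
        feats = self.encoder(tokens + self.pos)                   # (B, N, dim)
        B, N, D = feats.shape
        return feats.transpose(1, 2).reshape(B, D, self.grid, self.grid)


class CNNStudent(nn.Module):
    """Lightweight CNN student projected to the teacher's feature dimension."""
    def __init__(self, dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, dim, 3, stride=2, padding=1),          # /16 overall -> 14x14
        )

    def forward(self, x):
        return self.net(x)


def distill_loss(student_feat, teacher_feat):
    """Per-location cosine distance between student and frozen teacher features."""
    cos = F.cosine_similarity(student_feat, teacher_feat, dim=1)  # (B, H', W')
    return (1.0 - cos).mean()


if __name__ == "__main__":
    teacher, student = TinyViTTeacher(), CNNStudent()
    teacher.eval()                                     # teacher stays frozen
    normal = torch.rand(2, 3, 224, 224)                # normal training images
    simulated = normal + 0.3 * torch.randn_like(normal)  # stand-in for synthetic defects

    with torch.no_grad():
        t_feat = teacher(normal)                       # "normal" target features
    s_feat = student(simulated)                        # student sees the corrupted view

    loss = distill_loss(s_feat, t_feat)
    loss.backward()
    # At test time, the same per-location cosine distance can serve as an anomaly map.
    print(loss.item())
```

In this reading, anomaly localization at inference time falls out of the training objective: regions where the student cannot reproduce the teacher's normal features score high on the cosine-distance map.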

Keywords