IET Computer Vision (Oct 2024)

Knowledge distillation of face recognition via attention cosine similarity review

  • Zhuo Wang,
  • SuWen Zhao,
  • WanYi Guo

DOI
https://doi.org/10.1049/cvi2.12288
Journal volume & issue
Vol. 18, no. 7
pp. 875 – 887

Abstract

Deep learning‐based face recognition models have demonstrated remarkable performance in benchmark tests, and knowledge distillation has frequently been used to obtain high‐precision real‐time face recognition models designed for mobile and embedded devices. However, recent knowledge distillation methods for face recognition, which focus mainly on feature or logit distillation, neglect attention mechanisms, which play an important role in neural networks. An attention cosine similarity knowledge distillation method with a cross‐stage connection review path is proposed, uniting the attention mechanism with the review knowledge distillation approach. The method transfers attention maps obtained from the teacher network to the student network through cross‐stage connection paths. The efficacy and superiority of the proposed algorithm are demonstrated on popular benchmarks.
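The core idea described in the abstract, matching a student's spatial attention map to the teacher's via cosine similarity, can be sketched as below. This is a minimal illustration, not the paper's exact formulation: the choice of attention operator (channel-wise mean of absolute activations) and the single-stage loss are assumptions, and the paper's cross-stage review path would combine such losses across multiple stage pairings.

```python
import numpy as np

def attention_map(feat):
    """Collapse a (C, H, W) feature tensor to an L2-normalised spatial
    attention vector. The operator used here (mean of absolute
    activations over channels) is a common choice, assumed for
    illustration."""
    a = np.abs(feat).mean(axis=0)          # (H, W) spatial attention
    v = a.flatten()
    return v / (np.linalg.norm(v) + 1e-8)  # L2-normalise

def attention_cosine_loss(teacher_feat, student_feat):
    """1 minus the cosine similarity between the teacher's and the
    student's attention maps; 0 when the student reproduces the
    teacher's attention exactly."""
    t = attention_map(teacher_feat)
    s = attention_map(student_feat)
    return 1.0 - float(np.dot(t, s))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 4, 4))       # toy teacher features
student = rng.normal(size=(8, 4, 4))       # toy student features
print(round(attention_cosine_loss(teacher, teacher), 6))  # identical maps -> 0.0
print(0.0 <= attention_cosine_loss(teacher, student) <= 1.0)
```

Because the attention maps are non-negative after the absolute value, the cosine similarity lies in [0, 1], so the loss is bounded in the same range.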

Keywords