Electronics Letters (Aug 2023)

Class relationship‐based knowledge distillation for efficient human parsing

  • Yuqi Lang,
  • Kunliang Liu,
  • Jianming Wang,
  • Wonjun Hwang

DOI
https://doi.org/10.1049/ell2.12900
Journal volume & issue
Vol. 59, no. 15

Abstract


In computer vision, human parsing is challenging because it requires both accurate localization of human regions and fine-grained semantic partitioning. As a dense prediction task, it demands heavy computation and high-precision models. To enable real-time parsing on resource-limited devices, the authors introduced a lightweight model that uses ResNet18 as its backbone network. They simplified the pyramid module, clarifying contextual information while reducing complexity, and integrated a spatial attention fusion strategy to counter the precision loss incurred by light-weighting. Traditional models, despite their segmentation precision, are limited by their computational complexity and large parameter counts, so the authors applied knowledge distillation (KD) to improve the lightweight network's accuracy. Because conventional distillation methods can fail to transfer useful knowledge when the teacher and student networks differ substantially, the authors used a novel distillation approach based on inter-class and intra-class relations in the prediction outcomes, noticeably improving parsing accuracy. Experiments on the Look into Person (LIP) dataset show that the lightweight model significantly reduces parameters while maintaining parsing precision and enhancing inference speed.
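
The abstract does not give the exact loss formulation, so the following minimal PyTorch sketch only illustrates one plausible form of a class-relation-based distillation loss on dense prediction logits. The function name class_relation_kd_loss, the cosine-similarity inter-class term, and the temperature-scaled intra-class KL term are illustrative assumptions, not the paper's actual method.

    import torch
    import torch.nn.functional as F

    def class_relation_kd_loss(student_logits, teacher_logits, temperature=1.0):
        # Hypothetical sketch, not the paper's formulation.
        # student_logits, teacher_logits: (B, C, H, W) dense prediction logits.
        B, C, H, W = student_logits.shape

        # Flatten spatial dimensions: (B, C, N) with N = H*W pixels.
        s = student_logits.flatten(2)
        t = teacher_logits.flatten(2)

        # Inter-class relations: cosine similarity between per-class score maps,
        # yielding a (B, C, C) matrix that encodes how classes relate to each other.
        s_inter = F.cosine_similarity(s.unsqueeze(2), s.unsqueeze(1), dim=-1)
        t_inter = F.cosine_similarity(t.unsqueeze(2), t.unsqueeze(1), dim=-1)
        inter_loss = F.mse_loss(s_inter, t_inter)

        # Intra-class relations: each class's distribution over pixels, matched
        # with a temperature-scaled KL divergence (softmax over the pixel axis).
        s_intra = F.log_softmax(s / temperature, dim=-1)
        t_intra = F.softmax(t / temperature, dim=-1)
        intra_loss = F.kl_div(s_intra, t_intra, reduction='batchmean') * temperature ** 2

        return inter_loss + intra_loss

In training, such a relation loss would typically be added to the standard cross-entropy loss on ground-truth labels, weighted by a balancing hyperparameter.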

Keywords