IEEE Access (Jan 2023)

TAKDSR: Teacher Assistant Knowledge Distillation Framework for Graphics Image Super-Resolution

Min Yoon, Seunghyun Lee, Byung Cheol Song

DOI: https://doi.org/10.1109/ACCESS.2023.3323273
Journal volume & issue: Vol. 11, pp. 112015–112026

Abstract


This paper presents a framework for effectively applying knowledge distillation (KD) to super-resolution (SR) of computer graphics (CG) images. Specifically, we propose TAKDSR, a KD framework for SR that employs a teacher assistant (TA) network. The performance of SR models has improved dramatically with the development of deep learning, but these models have adopted complex neural network architectures that demand considerable computation and large numbers of parameters. As a result, conventional high-performance SR models are difficult to use for real-time up-scaling in CG applications that require high resolution and high frame rates. To address this, we apply KD to a lightweight SR model. Here, if the high-resolution (HR) image is given to the teacher as input so that it outperforms the student, the teacher's excessive performance creates a large gap between the two networks. The teacher's knowledge then becomes too hard and complex for the student, and transferring it directly can actually weaken the effect of KD. We therefore adopt a TA network to ease the propagation of knowledge from teacher to student. At the same time, the distribution of the compact features (CF) that serve as the teacher's decoder input is discretized to be compatible with the student's input distribution, enabling effective KD. Experimental results demonstrate that the proposed TAKDSR significantly improves the performance of a given SR model on CG image datasets.
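The abstract does not include an implementation, but the two-hop distillation pipeline it describes (teacher → TA → student, with discretized teacher features) can be sketched as follows. This is a minimal PyTorch illustration under stated assumptions: the `features()` method, the L1 feature-matching losses, the `alpha` weight, and the uniform quantizer standing in for the paper's CF discretization are all hypothetical choices for exposition, not the authors' actual design.

```python
import torch
import torch.nn.functional as F

def quantize(features: torch.Tensor, levels: int = 256) -> torch.Tensor:
    """Uniformly discretize activations (hypothetical stand-in for the
    paper's compact-feature (CF) discretization)."""
    f_min, f_max = features.min(), features.max()
    scale = (f_max - f_min) / (levels - 1) + 1e-8
    q = torch.round((features - f_min) / scale) * scale + f_min
    # Straight-through estimator: gradients bypass the rounding op.
    return features + (q - features).detach()

def takdsr_step(teacher, assistant, student, lr_img, hr_img, alpha=0.5):
    """One distillation step: teacher -> assistant -> student.
    The teacher sees the HR image (its privileged input per the abstract);
    the assistant and student see the LR image. All three networks are
    assumed to expose intermediate features via `features()`."""
    with torch.no_grad():
        t_feat = quantize(teacher.features(hr_img))  # discretized CF
    a_feat = assistant.features(lr_img)
    s_feat = student.features(lr_img)

    # The assistant imitates the discretized teacher features, and the
    # student imitates the assistant, bridging the capacity gap in two hops.
    loss_ta = F.l1_loss(a_feat, t_feat)
    loss_as = F.l1_loss(s_feat, a_feat.detach())
    loss_rec = F.l1_loss(student(lr_img), hr_img)    # standard SR loss
    return loss_rec + alpha * (loss_ta + loss_as)
```

In this reading, the TA absorbs the "hard" knowledge of the HR-fed teacher and re-expresses it at a difficulty the student can match, which is the motivation the abstract gives for inserting the intermediate network.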

Keywords