Applied Sciences (Dec 2023)

Few-Shot Image Classification via Mutual Distillation

  • Tianshu Zhang,
  • Wenwen Dai,
  • Zhiyu Chen,
  • Sai Yang,
  • Fan Liu,
  • Hao Zheng

DOI
https://doi.org/10.3390/app132413284
Journal volume & issue
Vol. 13, no. 24
p. 13284

Abstract


Due to their compelling performance and appealing simplicity, metric-based meta-learning approaches are gaining increasing attention for addressing the challenges of few-shot image classification. However, many of these methods employ intricate network architectures, which can lead to overfitting when trained with limited samples. To tackle this concern, we propose using mutual distillation to enhance metric-based meta-learning, effectively bolstering model generalization. Specifically, our approach involves two individual metric-based networks, such as prototypical networks and relation networks, each supplying the other with a regularization term. This method seamlessly integrates with any metric-based meta-learning approach. We undertake comprehensive experiments on two prevalent few-shot classification benchmarks, namely miniImageNet and Caltech-UCSD Birds-200-2011 (CUB), to demonstrate the effectiveness of our proposed algorithm. The results show that our method efficiently enhances each metric-based model through mutual distillation.
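To make the core idea concrete, the mutual regularization described in the abstract can be sketched as follows: each of the two networks minimizes its own task loss plus a distillation term that pulls its predicted class distribution toward the peer's. This is a minimal NumPy sketch under common assumptions (cross-entropy task loss, KL-divergence distillation term, a weighting factor `lam`); the function names and the exact form of the loss are illustrative, not the paper's implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    # Mean negative log-likelihood of the true labels y
    return -np.log(p[np.arange(len(y)), y]).mean()

def kl(p, q, eps=1e-12):
    # Mean KL divergence KL(p || q) over a batch
    return (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=-1).mean()

def mutual_distillation_losses(logits_a, logits_b, y, lam=1.0):
    """Each network's loss = its own task loss plus a KL term that
    pulls its predictions toward the peer network's predictions.
    `lam` (illustrative) weights the distillation regularizer."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    loss_a = cross_entropy(pa, y) + lam * kl(pb, pa)  # B teaches A
    loss_b = cross_entropy(pb, y) + lam * kl(pa, pb)  # A teaches B
    return loss_a, loss_b
```

With `lam = 0` each network trains independently; increasing `lam` couples the two networks, which is the regularization effect the abstract attributes to mutual distillation.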

Keywords