IEEE Access (Jan 2022)

Calibration of Few-Shot Classification Tasks: Mitigating Misconfidence From Distribution Mismatch

  • Sungnyun Kim,
  • Se-Young Yun

DOI
https://doi.org/10.1109/ACCESS.2022.3176090
Journal volume & issue
Vol. 10
pp. 53894 – 53908

Abstract


As many meta-learning algorithms improve performance on few-shot classification problems in practical applications, accurate prediction of uncertainty is considered essential. In meta-training, the algorithm treats all generated tasks equally and updates the model to perform well on them. Some of these tasks may make it difficult for the model to infer the query examples from the support examples, especially when there is a large mismatch between the support set and the query set. This distribution mismatch leads the model to incorrect confidence, i.e., a calibration problem. In this study, we propose a novel meta-training method that measures the distribution mismatch and enables the model to predict with more precise confidence. Moreover, our method is algorithm-agnostic and can be readily combined with a range of meta-learning models. Through extensive experiments, including dataset shift, we show that our training strategy prevents the model from becoming indiscriminately confident, thereby helping it produce calibrated classification results without loss of accuracy.
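For intuition only, the sketch below is not the paper's algorithm: it illustrates, under simple assumptions, two quantities the abstract refers to. The hypothetical prototype_mismatch function estimates support-query distribution mismatch as the distance between per-class feature prototypes, and expected_calibration_error is the standard ECE measure of the gap between a model's confidence and its accuracy.

```python
# Illustrative sketch (assumed helpers, not the authors' method):
# measuring support-query mismatch and calibration for a few-shot task.
import numpy as np

def prototype_mismatch(support_feats, support_labels, query_feats, query_labels):
    """Mean distance between per-class prototypes of the support and query sets."""
    dists = []
    for c in np.unique(support_labels):
        p_s = support_feats[support_labels == c].mean(axis=0)
        p_q = query_feats[query_labels == c].mean(axis=0)
        dists.append(np.linalg.norm(p_s - p_q))
    return float(np.mean(dists))

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard ECE: bin-weighted gap between mean confidence and accuracy."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(confidences[mask].mean() - correct[mask].mean())
    return ece

# Toy usage: a 5-way task with random 64-d features (illustration only).
rng = np.random.default_rng(0)
s_x, s_y = rng.normal(size=(25, 64)), np.repeat(np.arange(5), 5)
q_x, q_y = rng.normal(size=(75, 64)), np.repeat(np.arange(5), 15)
print("support-query mismatch:", prototype_mismatch(s_x, s_y, q_x, q_y))
```

A large mismatch value signals a task where the support set poorly represents the query distribution; the paper's contribution is a meta-training strategy that accounts for such tasks so the model's confidence stays calibrated (low ECE) rather than becoming indiscriminately high.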

Keywords