IEEE Access (Jan 2021)

Augmenting Few-Shot Learning With Supervised Contrastive Learning

  • Taemin Lee,
  • Sungjoo Yoo

DOI
https://doi.org/10.1109/ACCESS.2021.3074525
Journal volume & issue
Vol. 9
pp. 61466–61474

Abstract

Few-shot learning deals with small amounts of data, with which the conventional cross-entropy loss yields insufficient performance. We propose a pretraining approach for few-shot learning scenarios: since the quality of the feature extractor is a critical factor in few-shot learning, we strengthen the feature extractor using a supervised contrastive learning technique. Applying supervised contrastive learning to base-class training in a transductive few-shot pipeline leads to improved results, outperforming state-of-the-art methods on Mini-ImageNet and CUB. Furthermore, our experiments show that under domain shift a much larger dataset is otherwise needed to retain few-shot classification accuracy; with our method, the need for such a large dataset is eliminated. The accuracy gain can be translated into a 3.87× runtime reduction in a resource-constrained environment.
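To make the core ingredient concrete, below is a minimal NumPy sketch of the supervised contrastive (SupCon) loss that the abstract refers to, following the standard formulation (Khosla et al., 2020): each anchor is pulled toward all same-class embeddings in the batch and pushed from the rest. The function name, temperature value, and masking details are illustrative assumptions, not the authors' code.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over one batch (illustrative sketch).

    features: (N, D) array of L2-normalized embeddings
    labels:   (N,)   array of integer class labels
    """
    n = features.shape[0]
    sim = features @ features.T / temperature            # pairwise scaled similarities
    not_self = ~np.eye(n, dtype=bool)                    # exclude self-comparisons
    sim = sim - sim.max(axis=1, keepdims=True)           # numerical stability
    exp_sim = np.exp(sim) * not_self
    # log-probability of each pair relative to all non-self pairs for that anchor
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1) / np.maximum(pos_counts, 1)
    # anchors with no positive in the batch contribute zero
    return -(mean_log_prob_pos * (pos_counts > 0)).mean()
```

As a sanity check, embeddings that are tightly clustered by class should incur a lower loss than the same embeddings paired with shuffled labels, which is exactly the pressure the authors use to improve the feature extractor during base-class pretraining.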

Keywords