International Journal of Applied Earth Observation and Geoinformation (Sep 2023)

HCPNet: Learning discriminative prototypes for few-shot remote sensing image scene classification

  • Junjie Zhu,
  • Ke Yang,
  • Naiyang Guan,
  • Xiaodong Yi,
  • Chunping Qiu

Journal volume & issue
Vol. 123, Article 103447

Abstract


Few-shot learning is an important and challenging research topic in remote sensing image scene classification. Many existing approaches address this challenge with meta-learning and metric-learning techniques, which aim to develop feature extractors that can quickly adapt to new tasks from limited labeled data. However, these methods struggle on real-world datasets with class confusion, i.e., high inter-class similarity and high intra-class diversity. To overcome this limitation, we propose a novel and effective approach that learns query-specific prototype boundaries for few-shot remote sensing scene classification (FS-RSSC). Our approach consists of two key components: (1) a query-specific prototype representation that incorporates the query feature as a key factor in prototype formation, in contrast to conventional methods that use the query only for prediction; and (2) a prototypical regularization that enhances the discriminativeness of the prototypes by maximizing their inter-class separation. We model both components within a contrastive learning framework and integrate meta-learning with contrastive learning to learn an optimal initialization of the query-specific prototype representation that generalizes well to new queries. We name our model the Hybrid Contrastive Prototypical Network (HCPNet). We evaluate HCPNet on four popular datasets under two standard benchmarks, general few-shot classification and few-shot domain generalization. Our experimental results demonstrate that the proposed method outperforms state-of-the-art methods on both benchmarks by a large margin.
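The two key ideas in the abstract (prototypes conditioned on the query, and a regularizer that pushes prototypes of different classes apart) can be illustrated with a short sketch. This is a minimal, hypothetical illustration under assumed design choices (cosine-similarity weighting of support features and a pairwise prototype-separation penalty), not the authors' HCPNet implementation; the function names, the weighting scheme, and the temperature parameter tau are introduced here purely for illustration.

```python
# Minimal sketch of query-specific prototypes and a prototype-separation
# regularizer, loosely following the ideas described in the abstract.
# NOT the authors' HCPNet code; all names and weightings are assumptions.
import torch
import torch.nn.functional as F


def query_specific_prototypes(support, support_labels, query, n_way, tau=10.0):
    """Build one prototype per class *per query*, weighting each support
    feature by its cosine similarity to that query (assumed weighting)."""
    # support: (n_support, d), query: (n_query, d)
    sim = F.cosine_similarity(query.unsqueeze(1), support.unsqueeze(0), dim=-1)
    protos = []
    for c in range(n_way):
        mask = support_labels == c                     # supports of class c
        w = torch.softmax(tau * sim[:, mask], dim=-1)  # (n_query, n_c)
        protos.append(w @ support[mask])               # (n_query, d)
    return torch.stack(protos, dim=1)                  # (n_query, n_way, d)


def proto_separation_loss(protos):
    """Prototypical regularization: penalize similarity between prototypes
    of different classes to encourage inter-class separation."""
    p = F.normalize(protos, dim=-1)                    # (n_query, n_way, d)
    sim = p @ p.transpose(-1, -2)                      # pairwise cosine similarity
    off_diag = sim - torch.eye(p.size(1), device=p.device)  # drop self-similarity
    return off_diag.clamp(min=0).mean()


# Toy 5-way 5-shot episode with 64-d features.
d, n_way, n_shot, n_query = 64, 5, 5, 15
support = torch.randn(n_way * n_shot, d)
labels = torch.arange(n_way).repeat_interleave(n_shot)
query = torch.randn(n_query, d)

protos = query_specific_prototypes(support, labels, query, n_way)
logits = -torch.cdist(query.unsqueeze(1), protos).squeeze(1)  # nearest-prototype logits
reg = proto_separation_loss(protos)
print(logits.shape, reg.item())
```

In this sketch the classification logits come from distances to per-query prototypes, and the separation term would be added to the episodic training loss; how HCPNet actually combines the contrastive and meta-learning objectives is detailed in the paper itself.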

Keywords