Remote Sensing (Sep 2022)

A Novel Knowledge Distillation Method for Self-Supervised Hyperspectral Image Classification

  • Qiang Chi,
  • Guohua Lv,
  • Guixin Zhao,
  • Xiangjun Dong

DOI
https://doi.org/10.3390/rs14184523
Journal volume & issue
Vol. 14, no. 18
p. 4523

Abstract


Using deep learning to classify hyperspectral images (HSI) when only a few labeled samples are available is challenging. Recently, knowledge distillation methods based on soft label generation have been used to address classification with limited samples. Unlike hard labels, soft labels represent the probability that a sample belongs to each category and are therefore more informative for classification. However, existing soft-label generation methods for HSI classification cannot fully exploit the information contained in unlabeled samples. To address this problem, we propose a novel self-supervised learning method with knowledge distillation for HSI classification, termed SSKD. The main motivation is to exploit more valuable information for classification by adaptively generating soft labels for unlabeled samples. First, similarity discrimination is performed between all unlabeled and labeled samples, considering both spatial distance and spectral distance. Then, an adaptive nearest neighbor matching strategy is applied to the resulting data. Finally, probabilistic judgment of the category is performed to generate soft labels. Compared to the state-of-the-art method, our method improves classification accuracy by 4.88%, 7.09% and 4.96% on three publicly available datasets, respectively.
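The abstract only outlines the soft-label pipeline (spatial-spectral similarity, adaptive neighbor matching, probabilistic category judgment) without giving the exact formulas or the adaptive neighbor-selection rule. The sketch below is therefore only an illustration of the general idea of spatial-spectral nearest-neighbor voting, not the paper's method: the function name, the fixed neighbor count k, and the blending weight alpha are all assumptions introduced here for clarity.

```python
import numpy as np

def soft_labels_for_unlabeled(unlabeled_feats, unlabeled_xy,
                              labeled_feats, labeled_xy, labels,
                              n_classes, k=10, alpha=0.5):
    """Illustrative sketch (not the paper's exact procedure): assign each
    unlabeled pixel a soft label, i.e. a class-probability vector, by blending
    spectral and spatial distances to the labeled pixels and voting over the
    k nearest labeled neighbors with inverse-distance weights."""
    # Spectral distance: Euclidean distance between spectra, shape (N_u, N_l).
    spec = np.linalg.norm(unlabeled_feats[:, None, :] - labeled_feats[None, :, :], axis=-1)
    # Spatial distance: Euclidean distance between pixel coordinates.
    spat = np.linalg.norm(unlabeled_xy[:, None, :].astype(float)
                          - labeled_xy[None, :, :].astype(float), axis=-1)
    # Normalize both distances to [0, 1] so they are comparable,
    # then blend them with a hypothetical weight alpha.
    spec /= spec.max() + 1e-12
    spat /= spat.max() + 1e-12
    dist = alpha * spec + (1.0 - alpha) * spat

    soft = np.zeros((len(unlabeled_feats), n_classes))
    nearest = np.argsort(dist, axis=1)[:, :k]      # indices of k nearest labeled samples
    for i, idx in enumerate(nearest):
        w = 1.0 / (dist[i, idx] + 1e-12)           # inverse-distance vote weights
        np.add.at(soft[i], labels[idx], w)         # accumulate per-class votes
        soft[i] /= soft[i].sum()                   # normalize to a probability vector
    return soft
```

The resulting soft labels could then serve as distillation targets for training a student classifier; the paper's adaptive strategy presumably adjusts the neighborhood per sample rather than using a fixed k as done here.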

Keywords