Entropy (Dec 2022)

Self-Supervised Node Classification with Strategy and Actively Selected Labeled Set

  • Yi Kang,
  • Ke Liu,
  • Zhiyuan Cao,
  • Jiacai Zhang

DOI
https://doi.org/10.3390/e25010030
Journal volume & issue
Vol. 25, no. 1
p. 30

Abstract


To alleviate the impact of insufficient labels in classification problems with few labeled examples, self-supervised learning improves the performance of graph neural networks (GNNs) by exploiting the information in unlabeled nodes. However, no existing self-supervised pretext task performs optimally across different datasets, and combining self-supervised and supervised tasks introduces additional hyperparameters that must be tuned. To select the best-performing self-supervised pretext task for each dataset and to optimize the hyperparameters without requiring expert experience, we propose a novel automatic graph self-supervised learning framework and enhance it with a one-shot active learning method. Experimental results on three real-world citation datasets show that training GNNs with automatically optimized pretext tasks can match or even surpass the classification accuracy obtained with manually designed pretext tasks. Moreover, compared with randomly selected labeled nodes, actively selected labeled nodes further improve the classification performance of GNNs. Both the active selection and the automatic optimization contribute to semi-supervised node classification.
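The combination the abstract refers to is typically a joint objective, where a supervised loss on labeled nodes is added to a weighted self-supervised pretext loss, and where the best pretext task is chosen per dataset by validation performance. A minimal sketch of that idea (the function names, the weighting hyperparameter `alpha`, and the example scores are illustrative assumptions, not taken from the paper):

```python
def combined_loss(l_sup: float, l_ssl: float, alpha: float) -> float:
    """Joint objective: supervised loss plus a weighted pretext loss.

    `alpha` is the balancing hyperparameter the paper's framework
    would optimize automatically; here it is a plain argument.
    """
    return l_sup + alpha * l_ssl


def select_best_pretext(val_acc: dict) -> str:
    """Pick the pretext task whose model scores highest on validation.

    `val_acc` maps pretext-task names to validation accuracies;
    the names and numbers below are placeholders for illustration.
    """
    return max(val_acc, key=val_acc.get)


# Illustrative per-task validation accuracies on one dataset.
val_acc = {"node_property": 0.81, "edge_mask": 0.84, "pairwise_distance": 0.79}
print(select_best_pretext(val_acc))   # highest-scoring task wins
print(combined_loss(1.0, 0.5, 0.2))   # 1.0 + 0.2 * 0.5
```

In practice both `l_sup` and `l_ssl` would be computed by a GNN's forward pass; the point of the sketch is only that the task choice and the weight `alpha` are the quantities the proposed framework searches over instead of a human expert.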
