IEEE Access (Jan 2024)

CC-GNN: A Clustering Contrastive Learning Network for Graph Semi-Supervised Learning

  • Peng Qin,
  • Weifu Chen,
  • Min Zhang,
  • Defang Li,
  • Guocan Feng

DOI
https://doi.org/10.1109/ACCESS.2024.3398356
Journal volume & issue
Vol. 12
pp. 71956 – 71969

Abstract


In graph modeling, the scarcity of labeled data is a challenging issue. To address this issue, state-of-the-art graph models learn representations of graph data via contrastive learning. These models usually use data augmentation techniques to generate positive pairs for contrastive learning, which aims to maximize the similarity of positive data pairs while minimizing the similarity of negative data pairs. However, samples with the same label may still be mapped far apart in the feature space. To solve this problem, we introduce a novel model called Clustering Contrastive Graph Neural Network (CC-GNN), which develops a new kind of grouped contrastive learning that maximizes the similarity of positive data groups and minimizes the similarity of negative groups. That is, contrastive learning is defined at the group level rather than at the instance level. We assert that parameters learned by this kind of contrastive learning lead to better performance of graph neural networks on downstream classification tasks. We combined the clustering contrastive learning technique with three baseline GNN models for graph classification and found that their performance was significantly improved, which strongly supports our assertion. We also evaluated the models for node classification on three popular citation networks. Finally, we conducted an ablation study to analyze how clustering contrastive learning influences the performance of a graph model.
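
To make the group-level idea concrete, the sketch below shows one way a cluster-level contrastive loss can be computed: embeddings from two views are grouped by cluster assignment, and an InfoNCE-style loss is applied to cluster centroids rather than to individual instances. This is a minimal illustration under our own assumptions (the function name `cluster_contrastive_loss`, the k-style hard assignments, and the toy data are not taken from the paper).

    # Minimal sketch of group-level contrastive learning (not the authors' code).
    # Positive groups: the same cluster seen in two augmented views.
    # Negative groups: different clusters across the two views.
    import torch
    import torch.nn.functional as F

    def cluster_contrastive_loss(z1, z2, assignments, num_clusters, temperature=0.5):
        """z1, z2: (N, d) embeddings of two views; assignments: (N,) cluster ids.
        Assumes every cluster id in [0, num_clusters) has at least one member."""
        # One centroid per cluster, per view.
        centroids1 = torch.stack([z1[assignments == k].mean(dim=0) for k in range(num_clusters)])
        centroids2 = torch.stack([z2[assignments == k].mean(dim=0) for k in range(num_clusters)])
        c1 = F.normalize(centroids1, dim=1)
        c2 = F.normalize(centroids2, dim=1)
        # Cross-view similarity between all pairs of cluster centroids.
        logits = c1 @ c2.t() / temperature       # shape (K, K)
        targets = torch.arange(num_clusters)      # matching clusters are the positives
        # InfoNCE-style cross-entropy at the group (cluster) level.
        return F.cross_entropy(logits, targets)

    # Toy usage: 100 samples, 16-dim embeddings, 5 clusters.
    z1, z2 = torch.randn(100, 16), torch.randn(100, 16)
    assignments = torch.randint(0, 5, (100,))
    print(cluster_contrastive_loss(z1, z2, assignments, num_clusters=5))

In this sketch the contrast is between centroids, so two samples that share a cluster are never treated as negatives of each other, which is the failure mode of instance-level contrastive learning described above.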

Keywords