Entropy (Aug 2022)

A Block-Based Adaptive Decoupling Framework for Graph Neural Networks

  • Xu Shen,
  • Yuyang Zhang,
  • Yu Xie,
  • Ka-Chun Wong,
  • Chengbin Peng

DOI
https://doi.org/10.3390/e24091190
Journal volume & issue
Vol. 24, no. 9
p. 1190

Abstract

Graph neural networks (GNNs) with feature propagation have demonstrated their power in handling unstructured data. However, feature propagation is also a smoothing process that tends to make all node representations similar as the number of propagation steps increases. To address this problem, we propose a novel Block-Based Adaptive Decoupling (BBAD) framework to produce effective deep GNNs by utilizing backbone networks. In this framework, each block contains a shallow GNN in which feature propagation and transformation are decoupled. We also introduce layer regularizations and flexible receptive fields to automatically adjust the propagation depth and to provide different aggregation hops for each node, respectively. We prove that traditional coupled GNNs are more likely to suffer from over-smoothing when they become deep. We also demonstrate the diversity of outputs from different blocks of our framework. In the experiments, we conduct semi-supervised and fully supervised node classification on benchmark datasets, and the results verify that our method not only improves the performance of various backbone networks, but is also superior to existing deep graph neural networks with fewer parameters.
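The abstract hinges on decoupling feature propagation (parameter-free smoothing over the graph) from feature transformation (a learned mapping). The sketch below illustrates only that separation, assuming symmetric adjacency normalization and a fixed hop count; names such as `propagate`, `transform`, and `num_hops` are illustrative and not taken from the paper, and it does not implement BBAD's blocks, layer regularizations, or flexible receptive fields.

```python
# Minimal sketch of decoupled propagation vs. transformation (assumed names, not the paper's code).
import numpy as np

def normalized_adjacency(adj):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

def propagate(features, adj_norm, num_hops):
    """Decoupled propagation: smooth features for num_hops steps, with no learnable parameters."""
    h = features
    for _ in range(num_hops):
        h = adj_norm @ h
    return h

def transform(features, weight):
    """Decoupled transformation: a single learned linear map followed by ReLU (weights random here)."""
    return np.maximum(features @ weight, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)  # a 4-node path graph
    x = rng.normal(size=(4, 8))                   # node features
    w = rng.normal(size=(8, 3))                   # transformation weights
    a_norm = normalized_adjacency(adj)
    out = transform(propagate(x, a_norm, num_hops=2), w)
    print(out.shape)  # (4, 3)
```

Because propagation carries no parameters, the hop count can be varied per node or per block without changing the learned transformation, which is the flexibility the decoupled design is meant to provide.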

Keywords