IEEE Access (Jan 2020)

SR-HGAT: Symmetric Relations Based Heterogeneous Graph Attention Network

  • Zhenghao Zhang,
  • Jianbin Huang,
  • Qinglin Tan

DOI: https://doi.org/10.1109/ACCESS.2020.3022664
Journal volume & issue: Vol. 8, pp. 165631–165645

Abstract

Graph neural networks, as a deep-learning-based graph representation technology, can effectively capture the structural information encapsulated in graphs and generate expressive feature embeddings, and they have recently attracted growing research interest. However, existing models focus primarily on homogeneous graphs; when designing graph neural networks for heterogeneous graphs, the heterogeneity and rich semantic information pose great challenges. In this paper, we extend graph neural networks to heterogeneous graphs and propose a novel high-order Symmetric Relation based Heterogeneous Graph Attention Network, denoted SR-HGAT, which considers node features and high-order relations simultaneously and employs an aggregator based on a two-layer attention mechanism to capture essential semantics efficiently in an end-to-end manner. SR-HGAT first identifies the latent semantics underlying the explicit symmetric relations induced by different meta-paths and meta-graphs in a heterogeneous graph. A nested propagation mechanism is then designed to aggregate the semantic and structural features carried by different links and to compute the interaction strength of each symmetric relation. At the core of the model, a two-layer attention mechanism learns the importance of different neighbors as well as the weights of different symmetric relations, thereby capturing both structural and semantic information comprehensively. The resulting latent semantics are automatically fused into unified embeddings for specific mining tasks. Extensive experiments on three benchmark datasets demonstrate the efficacy of the proposed model, which significantly outperforms state-of-the-art baselines on various downstream tasks.
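The abstract describes the architecture only in prose, so the PyTorch sketch below illustrates one plausible reading of the two-layer attention design: GAT-style node-level attention within each symmetric relation (one relation per meta-path or meta-graph), followed by semantic-level attention that weights and fuses the per-relation embeddings. This is an illustrative approximation, not the paper's implementation; all class names, shapes, and hyperparameters are assumptions.

```python
# Minimal sketch of a two-layer attention aggregator, assuming HAN/GAT-style
# attention. Not the authors' code: names and dimensions are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NodeLevelAttention(nn.Module):
    """Attention over the neighbors reachable via one symmetric relation
    (i.e. one meta-path or meta-graph)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) relation adjacency.
        # Assumes every node has at least one neighbor (e.g. self-loops),
        # otherwise the softmax over an all -inf row produces NaNs.
        z = self.fc(h)                                   # (N, out_dim)
        N = z.size(0)
        zi = z.unsqueeze(1).expand(N, N, -1)
        zj = z.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.attn(torch.cat([zi, zj], -1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float('-inf'))       # keep neighbors only
        alpha = torch.softmax(e, dim=1)                  # neighbor importance
        return F.elu(alpha @ z)                          # (N, out_dim)

class SemanticLevelAttention(nn.Module):
    """Learns one weight per symmetric relation and fuses the per-relation
    node embeddings into a unified embedding."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.project = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 1, bias=False))

    def forward(self, zs):
        # zs: (R, N, dim), one embedding matrix per relation.
        w = self.project(zs).mean(dim=1)                 # (R, 1) relation score
        beta = torch.softmax(w, dim=0).unsqueeze(-1)     # relation weights
        return (beta * zs).sum(dim=0)                    # (N, dim)

# Toy usage: 4 nodes, 2 symmetric relations (e.g. two meta-paths).
h = torch.randn(4, 16)
adjs = [torch.eye(4), (torch.rand(4, 4) > 0.5).float() + torch.eye(4)]
node_att = NodeLevelAttention(16, 8)
sem_att = SemanticLevelAttention(8)
zs = torch.stack([node_att(h, a) for a in adjs])         # (2, 4, 8)
z = sem_att(zs)                                          # (4, 8) unified embeddings
```

In this reading, the node-level layer corresponds to the neighborhood-importance attention and the semantic-level layer to the relation-weight attention described in the abstract; the paper's nested propagation over high-order symmetric relations would replace the plain adjacency matrices used here.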

Keywords