Data Intelligence (Jun 2019)

Learning to Complete Knowledge Graphs with Deep Sequential Models

  • Guo, Lingbing,
  • Zhang, Qingheng,
  • Hu, Wei,
  • Sun, Zequn,
  • Qu, Yuzhong

DOI
https://doi.org/10.1162/dint_a_00016
Journal volume & issue
Vol. 1, no. 3
pp. 289 – 308

Abstract

Knowledge graph (KG) completion aims at filling in missing facts in a KG, where a fact is typically represented as a triple of the form (head, relation, tail). Traditional KG completion methods require two elements of a triple to be given (e.g., head and relation) in order to predict the remaining one. In this paper, we propose a new method that extends multi-layer recurrent neural networks (RNNs) to model triples in a KG as sequences. It obtains state-of-the-art performance on the common entity prediction task, i.e., predicting the tail (or head) given the head (or tail) and relation, on two benchmark data sets. Furthermore, the deep sequential character of our method enables it to predict the relation given only the head (or tail), and even to predict whole triples. Our experiments on these two new KG completion tasks demonstrate that our method achieves superior performance compared with several alternative methods.
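To make the sequence view of a triple concrete, the following is a minimal sketch (not the authors' implementation) of feeding a (head, relation) prefix through a plain single-layer RNN and scoring every candidate tail token with a softmax. The toy vocabulary, the shared token space for entities and relations, and all weights and dimensions here are illustrative assumptions.

```python
# Illustrative sketch only: a triple (head, relation, tail) treated as a
# 3-token sequence; an untrained tanh RNN scores candidate tails.
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary: entities and relations share one token space (an
# assumption made for brevity, not taken from the paper).
vocab = ["Paris", "France", "Einstein", "capital_of", "born_in"]
idx = {tok: i for i, tok in enumerate(vocab)}

d = 8                                    # embedding / hidden size
E = rng.normal(size=(len(vocab), d))     # token embeddings
W_x = rng.normal(size=(d, d))            # input-to-hidden weights
W_h = rng.normal(size=(d, d))            # hidden-to-hidden weights
W_o = rng.normal(size=(d, len(vocab)))   # hidden-to-vocabulary weights

def rnn_hidden(tokens):
    """Run a plain tanh RNN over a token sequence; return the final hidden state."""
    h = np.zeros(d)
    for tok in tokens:
        h = np.tanh(E[idx[tok]] @ W_x + h @ W_h)
    return h

def tail_distribution(head, relation):
    """Score every vocabulary token as the tail of (head, relation, ?)."""
    h = rnn_hidden([head, relation])
    logits = h @ W_o
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()               # softmax over candidate tokens

probs = tail_distribution("Paris", "capital_of")
```

Because the prefix can be any length, the same machinery extends naturally to the paper's new tasks: feeding only the head predicts a distribution over relations, and sampling token by token generates whole triples.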