IEEE Access (Jan 2020)

Knowledge Transfer for Out-of-Knowledge-Base Entities: Improving Graph-Neural-Network-Based Embedding Using Convolutional Layers

  • Zhongqin Bi,
  • Tianchen Zhang,
  • Ping Zhou,
  • Yongbin Li

DOI
https://doi.org/10.1109/ACCESS.2020.3019592
Journal volume & issue
Vol. 8
pp. 159039 – 159049

Abstract


Knowledge base completion (KBC) aims to predict missing information in a knowledge base. Most existing embedding-based KBC models assume that all test entities are available at training time. A question therefore arises: how can we answer queries about test entities that were not observed at training time? This is called the out-of-knowledge-base (OOKB) entity problem. In this article, we propose a parameter-efficient embedding model that combines the benefits of a graph neural network (GNN) and a convolutional neural network (CNN) to solve the KBC task with OOKB entities. First, we apply the GNN architecture to propagate information between nodes in the graph. Second, convolutional layers replace the dense transition matrix in the GNN, yielding more expressive embeddings with fewer parameters. Finally, we use a translation-based knowledge graph embedding model to score candidate triples. The model has learnable weights that adapt to information from neighbors, so it can exploit auxiliary knowledge about OOKB entities to compute their embeddings while remaining parameter efficient. We demonstrate the effectiveness of the proposed model on OOKB datasets; the code is available at https://github.com/Tianchen627/Knowledge-Transfer-for-Out-of-Knowledge-Base-Entities.
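The pipeline the abstract describes can be illustrated with a minimal sketch. All names, dimensions, and the averaging aggregator below are illustrative assumptions, not the authors' implementation (see their repository for the real code): an OOKB entity's embedding is computed by transforming its observed neighbors' embeddings with a small convolutional "transition" (instead of a full dense matrix) and pooling, then triples are scored with a TransE-style translation distance.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding dimension (assumed for illustration)

def conv_transition(x, kernel):
    """1-D convolution with 'same' padding, used as a parameter-efficient
    stand-in for a DIM x DIM transition matrix (len(kernel) weights
    instead of DIM * DIM)."""
    k = len(kernel)
    pad = k // 2
    xp = np.pad(x, pad)
    return np.array([np.dot(xp[i:i + k], kernel) for i in range(DIM)])

def aggregate(neighbor_embs, kernel):
    """One GNN propagation step: transform each neighbor embedding,
    then average-pool. An OOKB entity, absent at training time, gets
    its embedding from auxiliary triples linking it to known entities."""
    return np.mean([conv_transition(n, kernel) for n in neighbor_embs], axis=0)

def transe_score(h, r, t):
    """Translation-based score: a lower ||h + r - t|| means the triple
    (h, r, t) is more plausible."""
    return np.linalg.norm(h + r - t)

# Embeddings of in-KB neighbors of the unseen entity, and a relation vector.
neighbors = [rng.standard_normal(DIM) for _ in range(3)]
relation = rng.standard_normal(DIM)
kernel = rng.standard_normal(3)  # 3 learnable conv weights (assumed size)

ookb_entity = aggregate(neighbors, kernel)   # embedding for the unseen entity
candidate = rng.standard_normal(DIM)
print(transe_score(ookb_entity, relation, candidate))
```

In a trained model the kernel and relation vectors would be learned by minimizing a margin-based ranking loss over known triples; the sketch only shows the forward computation.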

Keywords