Scientific Reports (Dec 2023)
An inductive knowledge graph embedding via combination of subgraph and type information
Abstract
Conventional knowledge graph representation learning methods learn representations of entities and relations by projecting the triples in a knowledge graph into a continuous vector space. These vector representations improve the precision of link prediction and the efficiency of downstream tasks. However, such methods cannot process previously unseen entities that appear as the knowledge graph evolves. In other words, a model trained on a source knowledge graph cannot be applied to a target knowledge graph containing new, unseen entities. Recently, a few subgraph-based link prediction models have achieved inductive ability, but they all neglect semantic information. In this work, we propose TGraiL, an inductive representation learning model that considers not only topological structure but also semantic information. First, distance within the subgraph is used to encode each node's topological structure. Second, a projection matrix is used to encode entity type information. Finally, both kinds of information are fused during training to obtain the final vector representations of entities. The experimental results indicate that the model's performance is significantly improved over existing baseline models, demonstrating the method's effectiveness and superiority.
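To make the fusion of structural and semantic signals concrete, the sketch below shows one plausible way to combine subgraph-distance encodings with type embeddings projected by a learned matrix. It is a minimal illustration under assumed names and dimensions (`TGraiLEncoder`, `dist_dim`, `type_dim`, etc.), not the authors' released implementation.

```python
# Illustrative sketch, assuming a PyTorch setup. All names and hyperparameters
# are assumptions for demonstration, not the paper's actual code.
import torch
import torch.nn as nn


class TGraiLEncoder(nn.Module):
    def __init__(self, num_types, max_dist=10, dist_dim=32, type_dim=32, out_dim=64):
        super().__init__()
        # Structural part: embed each node's (distance-to-head, distance-to-tail)
        # within the enclosing subgraph, as in distance-based node labeling.
        self.dist_emb = nn.Embedding(max_dist + 1, dist_dim)
        # Semantic part: embed the entity type and project it with a learned matrix.
        self.type_emb = nn.Embedding(num_types, type_dim)
        self.type_proj = nn.Linear(type_dim, type_dim, bias=False)
        # Fusion: concatenate structural and semantic codes, then map to out_dim.
        self.fuse = nn.Linear(2 * dist_dim + type_dim, out_dim)

    def forward(self, dist_to_head, dist_to_tail, type_ids):
        # dist_to_head, dist_to_tail, type_ids: LongTensors of shape (num_nodes,)
        structural = torch.cat(
            [self.dist_emb(dist_to_head), self.dist_emb(dist_to_tail)], dim=-1
        )
        semantic = self.type_proj(self.type_emb(type_ids))
        return torch.relu(self.fuse(torch.cat([structural, semantic], dim=-1)))


# Example: encode 5 subgraph nodes with hypothetical distances and type ids.
encoder = TGraiLEncoder(num_types=20)
h = encoder(
    dist_to_head=torch.tensor([0, 1, 1, 2, 3]),
    dist_to_tail=torch.tensor([2, 1, 2, 0, 1]),
    type_ids=torch.tensor([3, 7, 7, 1, 12]),
)
print(h.shape)  # torch.Size([5, 64])
```

Because the encoder depends only on distances computed inside the extracted subgraph and on type identifiers, rather than on entity-specific embeddings, it can in principle be applied to entities never seen during training, which is the inductive property the abstract describes.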