IEEE Access (Jan 2024)

Leveraging Non-Parametric Reasoning With Large Language Models for Enhanced Knowledge Graph Completion

  • Ying Zhang,
  • Yangpeng Shen,
  • Gang Xiao,
  • Jinghui Peng

DOI
https://doi.org/10.1109/ACCESS.2024.3505433
Journal volume & issue
Vol. 12
pp. 177012 – 177027

Abstract

The completeness of knowledge graphs is critical to their effectiveness across various applications. However, existing knowledge graph completion methods face challenges such as difficulty adapting to new entity information, parameter explosion, and limited generalization capability. To address these issues, this paper proposes a knowledge graph completion framework that integrates large language models with case-based reasoning (CBR-LLM). By combining non-parametric reasoning with the semantic understanding capabilities of large language models, the framework not only improves completion accuracy but also significantly enhances generalization across a range of data-missing scenarios. Experimental results demonstrate that CBR-LLM excels at complex reasoning tasks and large-scale data-missing scenarios, providing an efficient and scalable solution for knowledge graph completion.

Keywords