Mathematics (Jul 2024)

Detect-Then-Resolve: Enhancing Knowledge Graph Conflict Resolution with Large Language Model

  • Huang Peng,
  • Pengfei Zhang,
  • Jiuyang Tang,
  • Hao Xu,
  • Weixin Zeng

DOI: https://doi.org/10.3390/math12152318
Journal volume & issue: Vol. 12, No. 15, p. 2318

Abstract

Conflict resolution for knowledge graphs (KGs) is a critical technique in knowledge fusion: it resolves conflicts between an existing KG and external knowledge while preserving post-fusion accuracy. However, current approaches often struggle with external triples involving unseen entities, owing to their limited knowledge, and they typically overlook conflict detection prior to resolution, a crucial step for accurate truth inference. This paper introduces CRDL, an approach that leverages conflict detection and large language models (LLMs) to identify truths. Conflict detection enables precise filtering strategies tailored to different types of relations and attributes, while carefully designed prompts inject relevant information into an LLM so that triples with unseen entities can be identified. Experimental results demonstrate the superiority of CRDL over baseline methods: it surpasses the state of the art with a 56.4% improvement in recall and a 68.2% increase in F1-score. Ablation studies and further analyses underscore the importance of each component of CRDL.
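The abstract describes a detect-then-resolve pipeline: first detect whether an external triple conflicts with the existing KG, then prompt an LLM with relevant injected information to decide which statement is true. The following is a minimal sketch of that idea based only on the abstract; the triple layout, the functional-relation conflict test, the prompt wording, and the query_llm callable are illustrative assumptions, not CRDL's actual design.

```python
# Hypothetical detect-then-resolve sketch; not the paper's implementation.
from typing import Callable, Optional, Tuple

Triple = Tuple[str, str, str]  # (head, relation, tail)

def detect_conflict(kg_triple: Triple, external: Triple,
                    functional_relations: set) -> bool:
    """Flag a conflict when both triples share a head and a functional
    relation (one expected tail) but disagree on the tail (assumed test)."""
    h1, r1, t1 = kg_triple
    h2, r2, t2 = external
    return h1 == h2 and r1 == r2 and r1 in functional_relations and t1 != t2

def resolve_with_llm(kg_triple: Triple, external: Triple, context: str,
                     query_llm: Callable[[str], str]) -> Optional[Triple]:
    """Build a prompt that injects relevant context and ask the LLM
    which candidate triple is true; return the chosen triple."""
    h, r, t_kg = kg_triple
    _, _, t_ext = external
    prompt = (
        f"Context about {h}: {context}\n"
        "Which statement is true?\n"
        f"(A) ({h}, {r}, {t_kg})\n"
        f"(B) ({h}, {r}, {t_ext})\n"
        "Answer with A or B."
    )
    answer = query_llm(prompt).strip().upper()
    if answer.startswith("A"):
        return kg_triple
    if answer.startswith("B"):
        return external
    return None  # abstain when the LLM gives no usable answer

if __name__ == "__main__":
    kg = ("Paris", "capital_of", "France")
    ext = ("Paris", "capital_of", "Germany")
    if detect_conflict(kg, ext, functional_relations={"capital_of"}):
        truth = resolve_with_llm(
            kg, ext,
            context="Paris is the capital and largest city of France.",
            query_llm=lambda p: "A",  # stub LLM for illustration
        )
        print(truth)
```

Running the stub prints the retained triple ("Paris", "capital_of", "France"); in practice the query_llm callable would wrap a real LLM endpoint, and the detection step would use the paper's relation- and attribute-specific filtering rather than the simple functional-relation check assumed here.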

Keywords