IEEE Access (Jan 2024)

TransKGQA: Enhanced Knowledge Graph Question Answering With Sentence Transformers

  • You Li Chong,
  • Chin Poo Lee,
  • Shahrin Zen Muhd-Yassin,
  • Kian Ming Lim,
  • Ahmad Kamsani Samingan

DOI
https://doi.org/10.1109/ACCESS.2024.3405583
Journal volume & issue
Vol. 12
pp. 74872–74887

Abstract

Knowledge Graph Question Answering (KGQA) plays a crucial role in extracting valuable insights from interconnected information. Existing methods, while commendable, face challenges such as contextual ambiguity and limited adaptability to diverse knowledge domains. This paper introduces TransKGQA, a novel approach addressing these challenges. Leveraging Sentence Transformers, TransKGQA enhances contextual understanding, making it adaptable to various knowledge domains. The model employs question-answer pair augmentation for robustness and introduces a threshold mechanism for reliable answer retrieval. TransKGQA overcomes limitations in existing works by offering a versatile solution for diverse question types. Experimental results, notably with the sentence-transformers/all-MiniLM-L12-v2 model, showcase remarkable performance with an F1 score of 78%. This work advances KGQA systems, contributing to knowledge graph construction, enhanced question answering, and automated Cypher query execution.
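As a rough illustration of the retrieval step the abstract describes (encoding questions with the sentence-transformers/all-MiniLM-L12-v2 model and applying a similarity threshold before returning an answer), the sketch below shows one way this could look. The verbalized facts, the threshold value, and the overall structure are illustrative assumptions and not the authors' implementation.

```python
# Minimal sketch (not the paper's pipeline): rank candidate knowledge-graph facts
# against a question with sentence-transformers/all-MiniLM-L12-v2 and apply a
# similarity threshold before accepting an answer.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")

# Hypothetical verbalized triples drawn from a knowledge graph.
facts = [
    "Kuala Lumpur is the capital of Malaysia.",
    "The Petronas Towers are located in Kuala Lumpur.",
    "Malaysia uses the Malaysian ringgit as its currency.",
]
question = "What is the capital of Malaysia?"

# Encode the question and the candidate facts into sentence embeddings.
q_emb = model.encode(question, convert_to_tensor=True)
f_emb = model.encode(facts, convert_to_tensor=True)

# Cosine similarity between the question and each fact.
scores = util.cos_sim(q_emb, f_emb)[0]

# Threshold mechanism: only return an answer when similarity is high enough.
THRESHOLD = 0.5  # assumed value for illustration only
best_idx = int(scores.argmax())
best_score = float(scores[best_idx])
if best_score >= THRESHOLD:
    print(f"Answer candidate: {facts[best_idx]} (score={best_score:.2f})")
else:
    print("No sufficiently similar fact found; abstain from answering.")
```

In practice, the thresholded best match would then be mapped back to the corresponding graph entity, which the abstract suggests is ultimately resolved through automated Cypher query execution.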

Keywords