Applied Sciences (Sep 2024)

Robust Text-to-Cypher Using Combination of BERT, GraphSAGE, and Transformer (CoBGT) Model

  • Quoc-Bao-Huy Tran,
  • Aagha Abdul Waheed,
  • Sun-Tae Chung

DOI: https://doi.org/10.3390/app14177881
Journal volume & issue: Vol. 14, no. 17, p. 7881

Abstract

Graph databases have become essential for managing and analyzing complex data relationships, with Neo4j emerging as a leading player in this domain. Neo4j, a high-performance NoSQL graph database, excels at efficiently handling connected data and offers powerful querying capabilities through its Cypher query language. However, Cypher's complexity makes it difficult for nonexpert users to write queries directly; making graph databases more accessible requires translating natural language queries into Cypher. Thus, in this paper, we propose a text-to-Cypher model that effectively translates natural language queries into Cypher. Our proposed model combines several methods to enable nonexpert users to interact with graph databases in English. The approach comprises three modules: key-value extraction, relation–properties prediction, and Cypher query generation. For key-value extraction and relation–properties prediction, we leverage BERT and GraphSAGE to extract features from natural language. Finally, we use a Transformer model to generate the Cypher query from these features. Additionally, because text-to-Cypher datasets are scarce, we introduce a new dataset containing English questions that query information within a graph database, paired with corresponding ground-truth Cypher queries. This dataset supports future model training, validation, and comparison on the text-to-Cypher task. Through experiments and evaluations, we demonstrate that our model achieves high accuracy and efficiency compared with well-known seq2seq models such as T5 and GPT-2, reaching an 87.1% exact-match score on the dataset.
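
The abstract describes a three-stage pipeline: BERT-based feature extraction feeding key-value extraction and relation–properties prediction, followed by Transformer-based Cypher generation. Below is a minimal, illustrative Python sketch of how such a pipeline could be wired together. Only the BERT feature extraction uses a real library API (Hugging Face transformers); the module names key_value_extractor, graphsage_predictor, and transformer_decoder, along with the example Cypher output, are hypothetical placeholders, not the authors' released code.

    # Illustrative sketch only: BERT encoding is real Hugging Face API;
    # the three CoBGT module interfaces below are hypothetical placeholders.
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    bert = BertModel.from_pretrained("bert-base-uncased")

    def encode_question(question: str) -> torch.Tensor:
        """Contextual token features shared by the key-value and
        relation-properties modules."""
        inputs = tokenizer(question, return_tensors="pt")
        with torch.no_grad():
            outputs = bert(**inputs)
        return outputs.last_hidden_state  # shape: (1, seq_len, 768)

    question = "Which movies did Tom Hanks act in?"
    features = encode_question(question)

    # Hypothetical module calls, mirroring the abstract's three-stage design:
    # key_values = key_value_extractor(features)               # e.g., {"Person.name": "Tom Hanks"}
    # rel_props  = graphsage_predictor(schema_graph, features) # e.g., ("ACTED_IN", "Movie.title")
    # cypher     = transformer_decoder(features, key_values, rel_props)
    # For the question above, a generated query might look like:
    # MATCH (p:Person {name: "Tom Hanks"})-[:ACTED_IN]->(m:Movie) RETURN m.title

One design point worth noting: routing a single shared BERT encoding into both the key-value and relation–properties modules lets the final Transformer condition its generation on schema-grounded features rather than on raw text alone, which is what distinguishes this approach from plain seq2seq baselines such as T5 or GPT-2.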

Keywords