Applied Sciences (Mar 2022)

Generative Model Using Knowledge Graph for Document-Grounded Conversations

  • Boeun Kim,
  • Dohaeng Lee,
  • Damrin Kim,
  • Hongjin Kim,
  • Sihyung Kim,
  • Ohwoog Kwon,
  • Harksoo Kim

DOI: https://doi.org/10.3390/app12073367
Journal volume & issue: Vol. 12, No. 7, p. 3367

Abstract


Document-grounded conversation (DGC) is a natural language generation task that produces fluent and informative responses by leveraging the dialogue history and one or more grounding documents. Recent work on DGC has focused on fine-tuning pretrained language models. However, these approaches must fit the background knowledge within the models' capacity constraints; for example, the maximum input length is typically limited to 512 or 1024 tokens. This limitation is especially severe for DGC because most grounding documents are longer than the maximum input length. To address this problem, we propose a document-grounded generative model that uses a knowledge graph. The proposed model converts knowledge sentences extracted from the given document(s) into a knowledge graph and fine-tunes the pretrained model using the graph. We validated the effectiveness of the proposed model in comparative experiments on the well-known Wizard-of-Wikipedia dataset, and the proposed model outperformed the previous state-of-the-art model in our experiments on the Doc2dial dataset.
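To make the idea concrete, the sketch below illustrates one way the pipeline described in the abstract could look: knowledge sentences are reduced to (subject, relation, object) triples, the triples are linearized into a short sequence that fits a 1024-token encoder, and a pretrained seq2seq model is fine-tuned on the dialogue history plus the linearized graph. This is a minimal illustration under stated assumptions, not the authors' implementation: the toy triple extractor, the BART backbone, and the graph markers are all placeholders (a real system would use a proper OpenIE or dependency-parse extractor and the authors' chosen pretrained model).

```python
# Minimal sketch (assumptions, not the paper's code): compress a long
# grounding document into a knowledge graph, linearize it, and fine-tune
# a pretrained seq2seq model on (history + graph) -> response pairs.

from transformers import BartTokenizerFast, BartForConditionalGeneration

def extract_triples(knowledge_sentences):
    """Toy stand-in for a real triple extractor (e.g., an OpenIE system):
    naively treats the first word as subject, the second as relation, and
    the rest as object."""
    triples = []
    for sent in knowledge_sentences:
        words = sent.rstrip(".").split()
        if len(words) >= 3:
            triples.append((words[0], words[1], " ".join(words[2:])))
    return triples

def linearize_graph(triples):
    """Flatten the graph into a compact string that is far shorter than the
    raw document, so it fits within the encoder's 1024-token limit. The
    <subj>/<rel>/<obj> markers are illustrative; in practice they would be
    registered as special tokens."""
    return " ".join(f"<subj> {s} <rel> {r} <obj> {o}" for s, r, o in triples)

tokenizer = BartTokenizerFast.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

history = "Do you know who wrote Hamlet?"
knowledge = ["William Shakespeare wrote the tragedy Hamlet around 1600."]
graph = linearize_graph(extract_triples(knowledge))

# Encoder input: dialogue history followed by the linearized knowledge graph.
inputs = tokenizer(history + " " + graph, return_tensors="pt",
                   truncation=True, max_length=1024)
# Gold response serves as the fine-tuning label.
labels = tokenizer("Yes, Hamlet was written by William Shakespeare.",
                   return_tensors="pt").input_ids

# One fine-tuning step: compute the generation loss and backpropagate.
loss = model(**inputs, labels=labels).loss
loss.backward()
```

The key point the sketch makes is the capacity argument from the abstract: the linearized triples carry the document's salient facts in a fraction of the original token budget, so long documents no longer overflow the encoder's input limit.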

Keywords