IEEE Access (Jan 2019)

Natural Answer Generation With Attention Over Instances

  • Mengxi Wei,
  • Yang Zhang

DOI
https://doi.org/10.1109/ACCESS.2019.2904337
Journal volume & issue
Vol. 7
pp. 61008 – 61017

Abstract


Natural answer generation (NAG) is increasingly popular in real-world knowledge base question answering (KBQA) systems because it can automatically generate natural language answers from a structured KB. Large-scale community QA pairs crawled from the Internet can be used directly to train NAG models. However, in these datasets one question often has multiple answers of varied quality, and NAG models suffer when all answers are treated equally. To address this problem, we propose two kinds of attention-based algorithms that handle all answers to a question at a time. Selective attention and self-attention mechanisms are used to dynamically weight the answers to one question during training. Specifically, selective attention methods weight the answers using the relationships between the KB objects the question requires and the generated answers, while self-attention methods weight them according to their generation difficulty. Experiments on a public open-domain community QA dataset demonstrate that Selective-ATT outperforms the state-of-the-art by 10.53% in entity accuracy, 9.34% in BLEU score, and 1.19% in Rouge score.
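The per-answer weighting described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `kb_match_scores` stands in for whatever per-answer relevance score (e.g., overlap with the KB objects the question needs) the selective-attention variant would compute, and the softmax-weighted sum of per-answer training losses replaces the equal-treatment average.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def selective_attention_loss(answer_losses, kb_match_scores):
    """Combine per-answer losses using attention weights.

    answer_losses:   training loss of each candidate answer to one question.
    kb_match_scores: hypothetical relevance score of each answer to the
                     KB objects the question requires (an assumption here,
                     not the paper's exact scoring function).
    Higher-scoring answers get larger weights, so low-quality answers
    contribute less to the gradient than under equal treatment.
    """
    weights = softmax(kb_match_scores)
    return sum(w * l for w, l in zip(weights, answer_losses))
```

With equal scores this reduces to the plain average, so equal treatment is the special case the attention mechanisms generalize.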

Keywords