IEEE Access (Jan 2022)

Neural Network With Hierarchical Attention Mechanism for Contextual Topic Dialogue Generation

  • Xiao Sun,
  • Bingbing Ding

DOI
https://doi.org/10.1109/ACCESS.2022.3140820
Journal volume & issue
Vol. 10
pp. 4628 – 4639

Abstract


The encoder-decoder model has achieved remarkable results in natural language generation. However, dialogue generation work often ignores the influence of dialogue context and topic information, so the generated replies either stray from the context or, lacking topic information, degenerate into generic responses. In this work, we study multi-turn dialogue generation on a large corpus and exploit both the context and the topic of the conversation during generation to produce more coherent, context-sensitive responses. We improve on existing models and attention mechanisms and propose a new hierarchical model, the HAT model, to better capture dialogue context. This enables the model to draw on more contextual information during processing and improves its contextual relevance, yielding higher-quality responses. In addition, to address the absence of topics in responses, we pre-train an LDA (Latent Dirichlet Allocation) topic model to extract topic words from the dialogue content, retaining as much of the dialogue's topic information as possible. Our model is evaluated extensively on several corpora, and the experiments show that it outperforms most hierarchical and non-hierarchical models on multiple evaluation metrics.
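The topic-word step described above can be sketched as follows. This is a minimal, illustrative example, not the authors' implementation: it assumes a pre-trained LDA model is summarized by a topic-word matrix `phi` and a per-dialogue topic mixture `theta` (both toy values here), and shows how the top-k words of the dominant topic could be selected and attached to the generation context.

```python
# Hypothetical sketch of the topic-word extraction step: given a
# (pre-trained) LDA topic-word distribution, keep the top-k words of
# the dialogue's dominant topic so they can condition the decoder.
# All names and numbers are illustrative, not taken from the paper.

def top_topic_words(topic_word_probs, vocab, k=3):
    """Return the k highest-probability words for one topic."""
    ranked = sorted(zip(topic_word_probs, vocab), reverse=True)
    return [word for _, word in ranked[:k]]

# Toy topic-word distribution over a 5-word vocabulary (rows = topics).
vocab = ["weather", "rain", "movie", "actor", "ticket"]
phi = [
    [0.40, 0.35, 0.10, 0.10, 0.05],  # topic 0: weather-related
    [0.05, 0.05, 0.35, 0.30, 0.25],  # topic 1: movie-related
]

# Suppose LDA inference gives this topic mixture for the current dialogue.
theta = [0.8, 0.2]
dominant = max(range(len(theta)), key=theta.__getitem__)
topic_words = top_topic_words(phi[dominant], vocab, k=2)
print(topic_words)  # ['weather', 'rain']
```

In a full pipeline, the extracted `topic_words` would be embedded and fed to the response generator alongside the hierarchical context representation.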

Keywords