IEEE Access (Jan 2019)

Natural Language Generation Using Dependency Tree Decoding for Spoken Dialog Systems

  • Youngmin Park,
  • Sangwoo Kang

DOI
https://doi.org/10.1109/ACCESS.2018.2889556
Journal volume & issue
Vol. 7
pp. 7250–7258

Abstract


In this paper, we propose a new natural language generation (NLG) method for spoken dialog systems and demonstrate its effectiveness. Studies on NLG often employ sequence decoding, which generates the words of a sentence in left-to-right order, feeding each generated word as input to the next step. In contrast, we propose a decoding method that generates words in the order obtained by traversing a dependency tree, feeding as input the pair consisting of each word's parent and preceding sibling in the tree. As a result, the most important words are generated first, so the words most relevant to each prediction are fed into the process. At prediction time, our model generates dependency trees and converts them into sentences. The proposed decoding method was evaluated by re-implementing a semantically controlled long short-term memory structure for NLG, with the input and predicted sequences converted to allow dependency tree decoding. The experimental results indicate that the proposed dependency tree decoding substantially improves the BLEU score and naturalness. Furthermore, when generating n-best sentences with dependency tree decoding, the word diversity of the output sentences increased by approximately 6%, yielding more varied sentence patterns.
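To make the decoding scheme concrete, the following is a minimal sketch, not the authors' implementation: words are generated top-down from the root of a dependency tree, each child conditioned on its parent and preceding sibling (a trained model would make these predictions; here a hypothetical `predict` function stands in), and the finished tree is then linearized into a sentence. The `Node` structure, the toy dependency table, and all names are illustrative assumptions.

```python
class Node:
    """A dependency-tree node with dependents on either side of the head."""
    def __init__(self, word):
        self.word = word
        self.left = []   # dependents placed before the head word
        self.right = []  # dependents placed after the head word

def linearize(node):
    """Convert a dependency (sub)tree into the word sequence it spans."""
    words = []
    for child in node.left:
        words.extend(linearize(child))
    words.append(node.word)
    for child in node.right:
        words.extend(linearize(child))
    return words

def decode(predict, root_word, max_nodes=50):
    """Top-down decoding: predict(parent_word, prev_sibling_word) stands in
    for the trained model and returns a list of (word, side) dependents,
    or an empty list to stop expanding that head."""
    root = Node(root_word)
    stack, count = [root], 1
    while stack and count < max_nodes:
        parent = stack.pop()
        prev = None  # preceding-sibling context for each new child
        for word, side in predict(parent.word, prev):
            child = Node(word)
            (parent.left if side == "L" else parent.right).append(child)
            stack.append(child)
            prev = word
            count += 1
    return root

# Toy "model": a fixed table mapping a head word to its dependents.
TABLE = {
    "restaurant": [("the", "L"), ("cheap", "L"), ("serves", "R")],
    "serves": [("pizza", "R")],
}
predict = lambda parent, prev: TABLE.get(parent, [])

tree = decode(predict, "restaurant")
print(" ".join(linearize(tree)))  # -> the cheap restaurant serves pizza
```

Note how the head "restaurant" is generated before its modifiers, matching the paper's claim that the most important words are produced first.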

Keywords