IEEE Access (Jan 2019)

A Novel Slot-Gated Model Combined With a Key Verb Context Feature for Task Request Understanding by Service Robots

  • Shuyou Zhang,
  • Junjie Jiang,
  • Zaixing He,
  • Xinyue Zhao,
  • Jinhui Fang

DOI
https://doi.org/10.1109/ACCESS.2019.2931576
Journal volume & issue
Vol. 7
pp. 105937–105947

Abstract


Spoken language understanding (SLU) is fundamental to how service robots handle natural-language task requests. SLU comprises two basic problems: intent determination (ID) and slot filling (SF). Slot-gated recurrent neural network models that jointly address the two tasks have been shown to be superior to single-task models and have achieved state-of-the-art performance. However, in task requests addressed to home service robots, the information carried by a given word often depends strongly on the key verbs in the sentence, and current methods struggle to capture this relation. In this paper, we extract the key instructional verb, which carries the core task information, using dependency parsing, and construct a feature that combines the key verb with its contextual information to address this problem. To further improve the slot-gated model, we exploit the strong relation between intent and slots: by introducing intent attention vectors into the slot attention vectors through a global-level gate and an element-level gate, we propose a novel dual slot-gated mechanism that explicitly models the relations between the ID and SF predictions and optimizes the global prediction results. Experimental results on the ATIS dataset and on an extended home service task request (SRTR) dataset based on FrameNet show that the proposed method outperforms state-of-the-art methods on both tasks. In particular, on SRTR, the results for SF, ID, and sentence-level semantic frame filling are improved by 1.7%, 1.1%, and 1.7%, respectively.
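The gating described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation: the global-level gate below follows the commonly used slot-gate form g = sum(v · tanh(c_slot + W·c_intent)), and the element-level gate is an assumed per-dimension sigmoid variant; the weight names `v`, `W_g`, and `W_e` are illustrative.

```python
import numpy as np

def global_gate(c_slot, c_intent, v, W_g):
    """Global-level gate: a single scalar weighting the whole slot context
    by its agreement with the intent context (assumed standard slot-gate form)."""
    return float(np.sum(v * np.tanh(c_slot + W_g @ c_intent)))

def element_gate(c_slot, c_intent, W_e):
    """Element-level gate: a per-dimension sigmoid weight, so each component
    of the slot context is modulated independently (assumed form)."""
    return 1.0 / (1.0 + np.exp(-(c_slot + W_e @ c_intent)))

def gated_slot_features(h, c_slot, c_intent, v, W_g, W_e):
    """Fuse the hidden state with the dually gated slot context before
    the per-word slot prediction layer."""
    g = global_gate(c_slot, c_intent, v, W_g)
    e = element_gate(c_slot, c_intent, W_e)
    return h + g * (e * c_slot)

# Tiny usage demo with random vectors standing in for attention contexts.
rng = np.random.default_rng(0)
d = 8
h = rng.standard_normal(d)          # BiLSTM hidden state for one word
c_slot = rng.standard_normal(d)     # slot attention context for that word
c_intent = rng.standard_normal(d)   # sentence-level intent attention context
v = rng.standard_normal(d)
W_g = rng.standard_normal((d, d))
W_e = rng.standard_normal((d, d))
fused = gated_slot_features(h, c_slot, c_intent, v, W_g, W_e)
print(fused.shape)  # (8,)
```

The scalar gate rescales the slot context as a whole, while the element-level gate lets individual dimensions of the intent context suppress or amplify individual slot features, which is the intuition behind combining the two.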

Keywords