IEEE Access (Jan 2024)

Injecting Linguistic Knowledge Into BERT for Dialogue State Tracking

  • Xiaohan Feng,
  • Xixin Wu,
  • Helen Meng

DOI
https://doi.org/10.1109/ACCESS.2024.3423452
Journal volume & issue
Vol. 12
pp. 93761–93770

Abstract

Dialogue State Tracking (DST) models often employ intricate neural network architectures that require substantial training data, and their inference processes lack transparency. This paper proposes a method that extracts linguistic knowledge via an unsupervised framework and then uses this knowledge to improve BERT’s performance and interpretability on DST tasks. The knowledge extraction procedure is computationally economical and requires no annotations or additional training data. The extracted knowledge can be injected by adding simple neural modules. We employ the Convex Polytopic Model (CPM) as a feature extraction tool for DST tasks and show that the acquired features correlate with syntactic and semantic patterns in the dialogues. This correlation gives a clearer picture of the linguistic features that influence the DST model’s decision-making. We benchmark this framework on various DST tasks and observe a notable improvement in accuracy.
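
To make the "simple neural modules" idea concrete, below is a minimal sketch of one plausible injection scheme, not the authors' exact architecture: a precomputed feature vector from an unsupervised extractor (such as CPM coordinates for the current dialogue turn) is fused with BERT's [CLS] representation through a small feed-forward layer before slot-value classification. The class name FeatureInjectedDST, the dimension cpm_dim, and the fusion design are illustrative assumptions.

    # Hypothetical sketch: fusing an external feature vector with BERT
    # for DST slot-value classification. Assumes CPM features are
    # precomputed per dialogue turn by a separate unsupervised extractor.
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class FeatureInjectedDST(nn.Module):
        def __init__(self, num_slot_values: int, cpm_dim: int = 50,
                     bert_name: str = "bert-base-uncased"):
            super().__init__()
            self.bert = BertModel.from_pretrained(bert_name)
            hidden = self.bert.config.hidden_size
            # Fusion module: project the concatenation of BERT's [CLS]
            # vector and the CPM feature vector back to the hidden size.
            self.fuse = nn.Sequential(
                nn.Linear(hidden + cpm_dim, hidden),
                nn.Tanh(),
            )
            self.classifier = nn.Linear(hidden, num_slot_values)

        def forward(self, input_ids, attention_mask, cpm_features):
            # cpm_features: (batch, cpm_dim) tensor of unsupervised
            # features for the dialogue turn (assumed precomputed).
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]  # [CLS] representation
            fused = self.fuse(torch.cat([cls, cpm_features], dim=-1))
            return self.classifier(fused)      # slot-value logits

Because the extra module only concatenates and projects, it adds few parameters and leaves the BERT backbone unchanged, which is consistent with the abstract's claim that injection requires only simple additional components.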

Keywords