Computational Linguistics (Jun 2021)

Approximating Probabilistic Models as Weighted Finite Automata

  • Ananda Theertha Suresh
  • Brian Roark
  • Michael Riley
  • Vlad Schogol

DOI
https://doi.org/10.1162/coli_a_00401
Journal volume & issue
Vol. 47, no. 2, pp. 221–254

Abstract


Weighted finite automata (WFAs) are often used to represent probabilistic models, such as n-gram language models, because, among other things, they are efficient in time and space for recognition tasks. The probabilistic source to be represented as a WFA, however, may come in many forms. Given a generic probabilistic model over sequences, we propose an algorithm to approximate it as a WFA such that the Kullback-Leibler divergence between the source model and the WFA target model is minimized. The proposed algorithm involves a counting step and a difference-of-convex optimization step, both of which can be performed efficiently. We demonstrate the usefulness of our approach on various tasks, including distilling n-gram models from neural models, building compact language models, and building open-vocabulary character models. The algorithms used for these experiments are available in an open-source software library.
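To make the counting step concrete, below is a minimal Python sketch that estimates a bigram WFA from sequences sampled out of a generic source model: states are n-gram histories and arc weights are negative log relative frequencies. The helper names (`sample_sequences`, `build_ngram_wfa`) and the dict-based arc representation are illustrative assumptions, not the API of the paper's open-source library, and the difference-of-convex reweighting step is omitted.

```python
import math
import random
from collections import defaultdict

def sample_sequences(source_sample, num_samples):
    """Draw sequences from a generic probabilistic source.

    `source_sample` is any zero-argument callable returning a token
    sequence (a hypothetical stand-in for the source model, e.g. a
    neural language model one samples from).
    """
    return [source_sample() for _ in range(num_samples)]

def build_ngram_wfa(sequences, order=2):
    """Estimate WFA arc weights from n-gram counts over sampled sequences.

    States are (order-1)-gram histories; each arc (state, label) carries
    -log of the relative frequency, i.e. a log-semiring weight.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        padded = ["<s>"] * (order - 1) + list(seq) + ["</s>"]
        for i in range(order - 1, len(padded)):
            history = tuple(padded[i - order + 1 : i])
            counts[history][padded[i]] += 1
    arcs = {}
    for history, nexts in counts.items():
        total = sum(nexts.values())
        for token, count in nexts.items():
            arcs[(history, token)] = -math.log(count / total)
    return arcs

# Toy usage: a source that samples length-3 sequences over {"a", "b"}.
toy_source = lambda: random.choices(["a", "b"], weights=[0.7, 0.3], k=3)
wfa_arcs = build_ngram_wfa(sample_sequences(toy_source, 5000), order=2)
```

With enough samples, these counts converge to the source's n-gram statistics, which is what makes the subsequent KL-divergence minimization over the fixed automaton topology well-posed.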