Transactions of the Association for Computational Linguistics (Jul 2020)
Consistent Unsupervised Estimators for Anchored PCFGs
Abstract
Learning probabilistic context-free grammars (PCFGs) from strings has been a classic problem in computational linguistics since Horning (1969). Here we present an algorithm based on distributional learning that is a consistent estimator for a large class of PCFGs satisfying certain natural conditions, including being anchored (Stratos et al., 2016). We proceed via a reparameterization of (top-down) PCFGs that we call a bottom-up weighted context-free grammar. We show that if the grammar is anchored and satisfies additional restrictions on its ambiguity, then its parameters can be related directly to distributional properties of the anchoring strings; we show the asymptotic correctness of a naive estimator and present simulations on synthetic data showing that algorithms based on this approach have good finite-sample behavior.
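As an informal illustration of the anchoring condition and of the kind of distributional statistics such an estimator consumes, the following minimal Python sketch defines a toy anchored PCFG in Chomsky normal form, samples strings from it, and computes naive plug-in estimates of the expected number of occurrences per string of each nonterminal's anchor terminal. This is only a sketch under simplifying assumptions: the grammar, symbol names, and helper functions are hypothetical and are not the estimator developed in the paper.

```python
import random
from collections import Counter

# Toy anchored PCFG in CNF: each nonterminal maps to a list of (rhs, probability),
# where rhs is either a pair of nonterminals (binary rule) or a single terminal
# (lexical rule). Each anchor terminal is emitted by exactly one nonterminal.
RULES = {
    "S":  [(("NP", "VP"), 0.9), (("ok",), 0.1)],   # "ok"   anchors S (toy choice)
    "NP": [(("noun",), 0.7), (("NP", "PP"), 0.3)], # "noun" anchors NP
    "VP": [(("verb",), 0.6), (("VP", "PP"), 0.4)], # "verb" anchors VP
    "PP": [(("prep",), 1.0)],                      # "prep" anchors PP
}
ANCHORS = {"S": "ok", "NP": "noun", "VP": "verb", "PP": "prep"}

def sample(symbol="S"):
    """Sample a yield (list of terminals) from the toy grammar, top-down."""
    rhss, probs = zip(*RULES[symbol])
    rhs = random.choices(rhss, weights=probs, k=1)[0]
    if len(rhs) == 1 and rhs[0] not in RULES:      # lexical rule: emit a terminal
        return [rhs[0]]
    return [t for child in rhs for t in sample(child)]

def anchor_statistics(corpus):
    """Naive plug-in estimates of E[# occurrences of each anchor per string]."""
    counts = Counter(t for string in corpus for t in string)
    return {nt: counts[a] / len(corpus) for nt, a in ANCHORS.items()}

if __name__ == "__main__":
    random.seed(0)
    corpus = [sample() for _ in range(10_000)]
    print(anchor_statistics(corpus))
```

Because each anchor is generated by exactly one nonterminal, these per-string anchor counts converge to the expected number of times that nonterminal is used in a derivation; statistics of this sort are the raw distributional quantities to which the paper's reparameterized (bottom-up) parameters can be related.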