Paper: Sparsity in Dependency Grammar Induction

ACL ID P10-2036
Title Sparsity in Dependency Grammar Induction
Venue Annual Meeting of the Association for Computational Linguistics
Session Short Paper
Year 2010
Authors

A strong inductive bias is essential in unsupervised grammar induction. We explore a particular sparsity bias in dependency grammars that encourages a small number of unique dependency types. Specifically, we investigate sparsity-inducing penalties on the posterior distributions of parent-child POS tag pairs in the posterior regularization (PR) framework of Graça et al. (2007). In experiments with 12 languages, we achieve substantial gains over the standard expectation maximization (EM) baseline, with average improvement in attachment accuracy of 6.3%. Further, our method outperforms models based on a standard Bayesian sparsity-inducing prior by an average of 4.9%. On English in particular, we show that our approach improves on several other state-of-the-art techniques.
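As a rough illustration of the kind of penalty the abstract describes, posterior regularization typically replaces the standard E-step posterior with a KL projection onto a penalized set. The sketch below shows one common form of such a sparsity term, an l1/l-infinity penalty over expected parent-child POS tag pair counts; the notation (sigma, q, phi_cpi) is illustrative and not necessarily the paper's exact formulation.

% Sketch of a PR-style E-step objective with an l1/l-infinity sparsity penalty
% (requires amsmath). q is the projected posterior over dependency trees z for
% corpus x, p_theta(z|x) is the model posterior, and phi_cpi(x,z) indicates that
% the i-th candidate edge whose child has tag c and parent has tag p is used in z.
\begin{align}
  \min_{q}\;\; \mathrm{KL}\bigl(q(z) \,\|\, p_\theta(z \mid x)\bigr)
  \;+\; \sigma \sum_{c,p} \max_{i}\, \mathbb{E}_{q}\bigl[\phi_{cpi}(x, z)\bigr]
\end{align}
% The max over edge instances i is the l-infinity part; the sum over (c,p) pairs
% is the l1 part. Penalizing this sum discourages many distinct parent-child tag
% types from receiving posterior mass, encouraging a sparse set of dependency types.

Under this view, the penalty charges a cost the first time any edge of a given parent-child tag type is used, but little extra for reusing that type, which is what pushes the induced grammar toward few unique dependency types.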