Paper: Shrinking Exponential Language Models

ACL ID N09-1053
Title Shrinking Exponential Language Models
Venue Human Language Technologies
Session Main Conference
Year 2009
Authors Stanley F. Chen

In (Chen, 2009), we show that for a variety of language models belonging to the exponential family, the test set cross-entropy of a model can be accurately predicted from its training set cross-entropy and its parameter values. In this work, we show how this relationship can be used to motivate two heuristics for “shrinking” the size of a language model to improve its performance. We use the first heuristic to develop a novel class-based language model that outperforms a baseline word trigram model by 28% in perplexity and 1.9% absolute in speech recognition word-error rate on Wall Street Journal data. We use the second heuristic to motivate a regularized version of minimum discrimination information models and show that this method outperforms other techniques for domain adaptation.
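As a rough illustration of the performance-prediction relationship the abstract refers to: the companion paper (Chen, 2009) reports that test set cross-entropy is approximately the training set cross-entropy plus a penalty proportional to the summed magnitudes of the model's parameters, which is why shrinking parameter values can improve generalization. The sketch below is a minimal, hypothetical rendering of that idea; the function name, the constant gamma (reported as roughly 0.938 in that work), and the toy numbers are assumptions for illustration, not the paper's implementation.

```python
def predicted_test_cross_entropy(train_cross_entropy, lambdas,
                                 num_train_events, gamma=0.938):
    """Predict test set cross-entropy for an exponential language model.

    Assumes the linear relationship reported in (Chen, 2009):
        H_test ~= H_train + gamma * (sum_i |lambda_i|) / D
    where D is the number of training events and gamma is an
    empirically fitted constant. Illustrative sketch only.
    """
    l1_norm = sum(abs(l) for l in lambdas)
    return train_cross_entropy + gamma * l1_norm / num_train_events

# Hypothetical usage: for the same training cross-entropy, a model with
# smaller parameter magnitudes is predicted to have lower test
# cross-entropy -- the intuition behind "shrinking" the model.
h_train = 7.10                     # training cross-entropy (made-up value)
lambdas = [0.4, -1.2, 0.05, 2.3]   # toy feature weights
print(predicted_test_cross_entropy(h_train, lambdas, num_train_events=1_000_000))
```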