Paper: Attention Shifting For Parsing Speech

ACL ID P04-1006
Title Attention Shifting For Parsing Speech
Venue Annual Meeting of the Association of Computational Linguistics
Session Main Conference
Year 2004
Authors

We present a technique that improves the efficiency of word-lattice parsing as used in speech recognition language modeling. Our technique applies a probabilistic parser iteratively, on each iteration focusing on a different subset of the word lattice. The parser's attention is shifted towards word-lattice subsets for which few or no syntactic analyses have been posited. This attention-shifting technique provides a six-times increase in speed (measured as the number of parser analyses evaluated) while performing equivalently when used as the first stage of a multi-stage parsing-based language model.
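The core loop described in the abstract can be sketched as follows. This is a hypothetical, simplified illustration of the attention-shifting idea, not the paper's implementation: the lattice is reduced to a set of edge labels, and `parse_subset` is a toy stand-in for the probabilistic parser, positing analyses only for the edges currently in focus.

```python
def parse_subset(edges, focus):
    """Toy 'parser': posits an analysis only for edges in the focus set.

    A real word-lattice parser would run a probabilistic chart parser over
    the lattice and return the edges covered by at least one analysis.
    """
    return {e for e in edges if e in focus}


def attention_shifting_parse(edges, initial_focus, max_iters=10):
    """Iteratively re-parse, shifting attention to unanalyzed edges.

    On each iteration, the parser is refocused on the subset of lattice
    edges for which no syntactic analysis has yet been posited, until
    every edge is covered (or an iteration limit is reached).
    """
    covered = set()
    focus = set(initial_focus)
    for _ in range(max_iters):
        covered |= parse_subset(edges, focus)
        uncovered = set(edges) - covered
        if not uncovered:        # every edge has at least one analysis
            break
        focus = uncovered        # shift attention to the uncovered subset
    return covered


edges = {"a", "b", "c", "d"}
result = attention_shifting_parse(edges, initial_focus={"a", "b"})
```

The efficiency gain in the paper comes from the same principle: rather than exhaustively evaluating analyses over the whole lattice, each pass spends parser effort only where coverage is still missing.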