Paper: Learning to Prune: Context-Sensitive Pruning for Syntactic MT

ACL ID P13-2063
Title Learning to Prune: Context-Sensitive Pruning for Syntactic MT
Venue Annual Meeting of the Association for Computational Linguistics
Session Short Paper
Year 2013
Authors

We present a context-sensitive chart pruning method for CKY-style MT decoding. Source phrases that are unlikely to have aligned target constituents are identified using sequence labellers learned from the parallel corpus, and speed-up is obtained by pruning corresponding chart cells. The proposed method is easy to implement, orthogonal to cube pruning and additive to its pruning power. On a full-scale English-to-German experiment with a string-to-tree model, we obtain a speed-up of more than 60% over a strong baseline, with no loss in BLEU.
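To illustrate the idea in the abstract, the following is a minimal, self-contained sketch of context-sensitive chart-cell pruning in a CKY-style decoder: a sequence labeller marks source spans unlikely to have aligned target constituents, and the chart cells for those spans are skipped. The toy labeller, toy grammar, and all function names below are illustrative assumptions, not the paper's actual implementation.

```python
def toy_labeller(tokens):
    """Stand-in for a learned sequence labeller: tag each source token
    as 'keep' or 'prune'. Here, function words are (arbitrarily) tagged
    'prune'; the paper learns such decisions from the parallel corpus."""
    function_words = {"the", "a", "of", "to"}
    return ["prune" if t in function_words else "keep" for t in tokens]

def prunable_spans(tokens, tags):
    """Spans whose tokens are all tagged 'prune' get no chart cell."""
    n = len(tokens)
    return {(i, j) for i in range(n) for j in range(i + 1, n + 1)
            if all(tags[k] == "prune" for k in range(i, j))}

def cky_decode(tokens, lexical_rules, binary_rules, pruned):
    """CKY over source spans; pruned cells are left empty, so no rule
    applications (and no cube-pruning work) are spent on them."""
    n = len(tokens)
    chart = {(i, j): set() for i in range(n) for j in range(i + 1, n + 1)}

    for i, tok in enumerate(tokens):                  # length-1 spans
        if (i, i + 1) not in pruned:
            chart[(i, i + 1)] = set(lexical_rules.get(tok, ()))

    for length in range(2, n + 1):                    # longer spans
        for i in range(n - length + 1):
            j = i + length
            if (i, j) in pruned:
                continue                              # pruned cell: skip entirely
            for k in range(i + 1, j):
                for lhs, (left, right) in binary_rules.items():
                    if left in chart[(i, k)] and right in chart[(k, j)]:
                        chart[(i, j)].add(lhs)
    return chart

if __name__ == "__main__":
    tokens = "the cat sat".split()
    tags = toy_labeller(tokens)
    pruned = prunable_spans(tokens, tags)
    lexical = {"the": {"DT"}, "cat": {"NN"}, "sat": {"VB"}}
    binary = {"NP": ("DT", "NN"), "S": ("NP", "VB")}
    chart = cky_decode(tokens, lexical, binary, pruned)
    print(pruned)         # {(0, 1)} -- the cell for "the" is pruned
    print(chart[(0, 3)])  # may be empty if pruning removed a needed cell
```

The speed-up comes from the `continue` on pruned cells: every skipped cell avoids all rule combinations (and cube-pruning expansions) over that span, which is why the method is additive to cube pruning rather than a replacement for it.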