Paper: Exploiting N-best Hypotheses for SMT Self-Enhancement

ACL ID P08-2040
Title Exploiting N-best Hypotheses for SMT Self-Enhancement
Venue Annual Meeting of the Association of Computational Linguistics
Session Main Conference
Year 2008
Authors

Word and n-gram posterior probabilities estimated on N-best hypotheses have been used to improve the performance of statistical machine translation (SMT) in a rescoring framework. In this paper, we extend the idea to estimate posterior probabilities on N-best hypotheses for translation phrase-pairs, target language n-grams, and source word reorderings. The SMT system is self-enhanced with the posterior knowledge learned from N-best hypotheses in a re-decoding framework. Experiments on the NIST Chinese-to-English task show performance improvements for all the strategies. Moreover, the combination of the three strategies achieves further improvements and outperforms the baseline by 0.67 BLEU on the NIST-2003 set and by 0.64 on the NIST-2005 set, respectively.
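To illustrate the kind of posterior estimation the abstract refers to, the following is a minimal sketch of computing target n-gram posterior probabilities from an N-best list. The function name, the use of log-domain model scores normalized via softmax, and the choice to count each n-gram at most once per hypothesis are assumptions for illustration, not the paper's exact formulation.

```python
import math
from collections import defaultdict


def ngram_posteriors(nbest, n=2):
    """Estimate n-gram posterior probabilities from an N-best list.

    `nbest` is a list of (hypothesis_tokens, log_score) pairs for one
    source sentence. Hypothesis posteriors are obtained by exponentiating
    and normalizing the log scores (an assumed scoring scheme).
    """
    # Normalize hypothesis scores into posterior weights (softmax).
    max_score = max(score for _, score in nbest)
    weights = [math.exp(score - max_score) for _, score in nbest]
    z = sum(weights)

    posteriors = defaultdict(float)
    for (tokens, _), w in zip(nbest, weights):
        seen = set()
        for i in range(len(tokens) - n + 1):
            gram = tuple(tokens[i:i + n])
            # Count each n-gram once per hypothesis (assumption).
            if gram not in seen:
                seen.add(gram)
                posteriors[gram] += w / z
    return dict(posteriors)


# Toy 3-best list for one source sentence.
nbest = [
    ("the cat sat".split(), -1.2),
    ("the cat sits".split(), -1.9),
    ("a cat sat".split(), -2.5),
]
print(ngram_posteriors(nbest, n=2))
```

Analogous counts over phrase-pairs or reordering patterns, accumulated per source sentence, would yield the other two kinds of posterior knowledge the abstract mentions; how these feed back into re-decoding is described in the paper itself.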