Paper: Bleu: A Method For Automatic Evaluation Of Machine Translation

ACL ID P02-1040
Title Bleu: A Method For Automatic Evaluation Of Machine Translation
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2002
Authors

Human evaluations of machine translation are extensive but expensive. Human evaluations can take months to finish and involve human labor that cannot be reused. We propose a method of automatic machine translation evaluation that is quick, inexpensive, and language-independent, that correlates highly with human evaluation, and that has little marginal cost per run.
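The abstract only motivates the metric; as a rough illustration of the core idea the paper develops (modified n-gram precision clipped against reference counts, combined with a brevity penalty), here is a minimal sentence-level sketch. It is not the paper's implementation: it uses uniform weights, no smoothing, and no corpus-level aggregation, and all function names are ours.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, references, max_n=4):
    """Illustrative BLEU: clipped n-gram precision times a brevity penalty.

    candidate: list of tokens; references: list of token lists.
    Sketch only -- uniform weights, no smoothing, sentence level.
    """
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        # Clip each candidate n-gram count by its maximum count in any
        # single reference, so repeated words cannot inflate precision.
        max_ref = Counter()
        for ref in references:
            for ng, c in Counter(ngrams(ref, n)).items():
                max_ref[ng] = max(max_ref[ng], c)
        clipped = sum(min(c, max_ref[ng]) for ng, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if clipped == 0:
            return 0.0  # no smoothing in this sketch
        log_precisions.append(math.log(clipped / total))
    # Brevity penalty: penalize candidates shorter than the
    # closest-length reference.
    c = len(candidate)
    r = min((abs(len(ref) - c), len(ref)) for ref in references)[1]
    bp = 1.0 if c > r else math.exp(1 - r / c)
    return bp * math.exp(sum(log_precisions) / max_n)

cand = "the cat is on the mat".split()
refs = ["the cat is on the mat".split(),
        "there is a cat on the mat".split()]
print(round(bleu(cand, refs), 4))  # identical to one reference -> 1.0
```

A candidate matching a reference exactly scores 1.0; a candidate sharing no n-grams with any reference scores 0.0, and shorter near-matches are discounted by the brevity penalty.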