Paper: Can Markov Models Over Minimal Translation Units Help Phrase-Based SMT?

ACL ID P13-2071
Title Can Markov Models Over Minimal Translation Units Help Phrase-Based SMT?
Venue Annual Meeting of the Association for Computational Linguistics
Session Short Paper
Year 2013
Authors

The phrase-based and N-gram-based SMT frameworks complement each other. While the former is better able to memorize, the latter provides a more principled model that captures dependencies across phrasal boundaries. Some work has been done to combine insights from these two frameworks. A recent successful attempt showed the advantage of using phrase-based search on top of an N-gram-based model. We probe this question in the reverse direction by investigating whether integrating N-gram-based translation and reordering models into a phrase-based decoder helps overcome the problematic phrasal independence assumption. A large scale evaluation over 8 language pairs shows that performance does significantly improve.
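To make the core idea concrete, the following is a minimal sketch of a Markov (bigram) model over sequences of minimal translation units, scored left to right across phrase boundaries. The `MTUBigramModel` class, the `"src|tgt"` token format, and the add-one smoothing are all illustrative assumptions, not the paper's actual model, which uses higher-order N-grams with proper smoothing inside a phrase-based decoder.

```python
from collections import defaultdict
import math

class MTUBigramModel:
    """Toy bigram Markov model over minimal translation units (MTUs).

    An MTU is represented here as a "source|target" string. This is an
    illustrative sketch only: real N-gram-based SMT models are
    higher-order and use smoothing such as Kneser-Ney.
    """

    def __init__(self):
        self.bigram = defaultdict(int)   # counts of (prev MTU, cur MTU)
        self.unigram = defaultdict(int)  # counts of prev MTU contexts
        self.vocab = set()

    def train(self, mtu_sequences):
        """Count bigrams over MTU sequences, with sentence boundaries."""
        for seq in mtu_sequences:
            tokens = ["<s>"] + seq + ["</s>"]
            for prev, cur in zip(tokens, tokens[1:]):
                self.bigram[(prev, cur)] += 1
                self.unigram[prev] += 1
                self.vocab.update((prev, cur))

    def logprob(self, seq):
        """Add-one smoothed log-probability of an MTU sequence.

        Because the chain runs over MTUs rather than phrases, the score
        conditions each unit on its predecessor even when the two fall
        in different phrases, which is the dependency a phrase-based
        model alone cannot capture.
        """
        tokens = ["<s>"] + seq + ["</s>"]
        v = len(self.vocab)
        lp = 0.0
        for prev, cur in zip(tokens, tokens[1:]):
            lp += math.log((self.bigram[(prev, cur)] + 1)
                           / (self.unigram[prev] + v))
        return lp
```

In a phrase-based decoder, a score like `logprob` would be added as an extra feature on each hypothesis expansion, so that the N-gram model over MTUs softens the phrasal independence assumption without changing the search itself.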