Paper: Learning Better Monolingual Models with Unannotated Bilingual Text

ACL ID W10-2906
Title Learning Better Monolingual Models with Unannotated Bilingual Text
Venue International Conference on Computational Natural Language Learning
Session Main Conference
Year 2010
Authors

This work shows how to improve state-of-the-art monolingual natural language processing models using unannotated bilingual text. We build a multiview learning objective that enforces agreement between monolingual and bilingual models. In our method, the first, monolingual view consists of supervised predictors learned separately for each language. The second, bilingual view consists of log-linear predictors learned over both languages on bilingual text. Our training procedure estimates the parameters of the bilingual model using the output of the monolingual model, and we show how to combine the two models to account for dependence between views. For the task of named entity recognition, using bilingual predictors increases F1 by 16.1% absolute over a supervised monolingual model, and r...
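
The following is a minimal, illustrative sketch (not the paper's actual objective or implementation) of the idea described in the abstract: a monolingual log-linear predictor provides soft targets that a bilingual log-linear predictor is trained to agree with on unannotated bilingual text, and the two views are combined at prediction time. All feature dimensions, the cross-entropy agreement term, and the mixture weight alpha are assumptions made for the example.

import numpy as np

def softmax(scores):
    """Log-linear distribution over labels from raw scores."""
    z = scores - scores.max()
    e = np.exp(z)
    return e / e.sum()

def mono_predict(w_mono, mono_feats):
    """Monolingual view: supervised log-linear predictor for one language."""
    return softmax(mono_feats @ w_mono)          # shape: (num_labels,)

def bi_predict(w_bi, bi_feats):
    """Bilingual view: log-linear predictor over features of both languages."""
    return softmax(bi_feats @ w_bi)

def agreement_loss(w_bi, w_mono, bi_feats, mono_feats):
    """Cross-entropy of the bilingual view against the (fixed) monolingual
    view's output; minimizing it pushes the two views to agree on
    unannotated bilingual text. (An illustrative choice of agreement term.)"""
    p_mono = mono_predict(w_mono, mono_feats)    # treated as soft targets
    p_bi = bi_predict(w_bi, bi_feats)
    return -np.sum(p_mono * np.log(p_bi + 1e-12))

def combined_predict(w_mono, w_bi, mono_feats, bi_feats, alpha=0.5):
    """Combine the two views at test time; a simple log-linear mixture is one
    way to account for dependence between views (alpha is a guess here)."""
    log_p = alpha * np.log(mono_predict(w_mono, mono_feats) + 1e-12) \
          + (1 - alpha) * np.log(bi_predict(w_bi, bi_feats) + 1e-12)
    return softmax(log_p)

# Toy usage: 3 NER-style labels with random illustrative features and weights.
rng = np.random.default_rng(0)
num_labels, d_mono, d_bi = 3, 8, 12
w_mono = rng.normal(size=(d_mono, num_labels))
w_bi = rng.normal(size=(d_bi, num_labels))
mono_feats = rng.normal(size=d_mono)
bi_feats = rng.normal(size=d_bi)

print("agreement loss:", agreement_loss(w_bi, w_mono, bi_feats, mono_feats))
print("combined posterior:", combined_predict(w_mono, w_bi, mono_feats, bi_feats))

In this toy version, the agreement loss would be minimized with respect to the bilingual weights while the monolingual predictor stays fixed, mirroring the training procedure described above in which the bilingual model is estimated from the monolingual model's output.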