Paper: How much do word embeddings encode about syntax?

ACL ID P14-2133
Title How much do word embeddings encode about syntax?
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2014
Authors Jacob Andreas, Dan Klein

Do continuous word embeddings encode any useful information for constituency parsing? We isolate three ways in which word embeddings might augment a state-of-the-art statistical parser: by connecting out-of-vocabulary words to known ones, by encouraging common behavior among related in-vocabulary words, and by directly providing features for the lexicon. We test each of these hypotheses with a targeted change to a state-of-the-art baseline. Despite small gains on extremely small supervised training sets, we find that extra information from embeddings appears to make little or no difference to a parser with adequate training data. Our results support an overall hypothesis that word embeddings import syntactic information that is ultimately redundant with distinctions learned from...
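To make the first hypothesis concrete, here is a minimal sketch of one way a parser could connect an out-of-vocabulary word to a known one: map the OOV word to its nearest in-vocabulary neighbor in pretrained embedding space by cosine similarity. The function name, the `embeddings` dict, and the `known_vocab` set are hypothetical scaffolding, not the paper's actual implementation.

```python
import numpy as np

def nearest_known_word(oov_word, embeddings, known_vocab):
    """Map an out-of-vocabulary word to its nearest in-vocabulary
    neighbor by cosine similarity in embedding space.

    embeddings: dict of word -> np.ndarray (pretrained vectors)
    known_vocab: set of words seen in the parser's training data
    Returns None if the OOV word has no pretrained vector.
    """
    if oov_word not in embeddings:
        return None
    v = embeddings[oov_word]
    v = v / np.linalg.norm(v)
    best_word, best_sim = None, -1.0
    for w in known_vocab:
        if w not in embeddings:
            continue
        u = embeddings[w]
        sim = float(np.dot(v, u) / np.linalg.norm(u))
        if sim > best_sim:
            best_word, best_sim = w, sim
    return best_word

# Hypothetical usage: score an unseen word with the lexical
# statistics of its nearest known neighbor.
# proxy = nearest_known_word("unclefts", embeddings, known_vocab)
```

Under this scheme the parser never needs new parameters for the OOV word; it simply reuses the lexicon entry of the nearest known neighbor, which is why any benefit depends on the embedding neighborhood actually reflecting syntactic behavior.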