Paper: Tailoring Continuous Word Representations for Dependency Parsing

ACL ID P14-2131
Title Tailoring Continuous Word Representations for Dependency Parsing
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2014
Authors

Word representations have proven useful for many NLP tasks, e.g., Brown clusters as features in dependency parsing (Koo et al., 2008). In this paper, we investigate the use of continuous word representations as features for dependency parsing. We compare several popular embeddings to Brown clusters, via multiple types of features, in both news and web domains. We find that all embeddings yield significant parsing gains, including some recent ones that can be trained in a fraction of the time of others. Explicitly tailoring the representations for the task leads to further improvements. Moreover, an ensemble of all representations achieves the best results, suggesting their complementarity.
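One way to plug continuous embeddings into a feature-based dependency parser is to discretize them into indicator features, analogous to Brown cluster bit strings. Below is a minimal illustrative sketch, assuming a simple equal-width bucketing of each embedding dimension; the function name `bucket_features`, the bucket scheme, and the feature strings are hypothetical and not the paper's exact recipe.

```python
import numpy as np

def bucket_features(word, embeddings, num_buckets=5):
    """Hypothetical sketch: map each embedding dimension of `word` to a
    bucket index and emit one string-valued indicator feature per dimension."""
    vec = embeddings.get(word)
    if vec is None:
        return ["emb_oov"]  # unknown word: fall back to a single OOV feature
    # Clip values to [-1, 1] and discretize into equal-width buckets.
    clipped = np.clip(vec, -1.0, 1.0)
    edges = np.linspace(-1.0, 1.0, num_buckets + 1)
    bucket_ids = np.digitize(clipped, edges[1:-1])
    return [f"emb_dim{d}_bucket{b}" for d, b in enumerate(bucket_ids)]

# Toy usage with a 4-dimensional embedding table.
embeddings = {"bank": np.array([0.12, -0.80, 0.33, 0.05])}
print(bucket_features("bank", embeddings))
# ['emb_dim0_bucket2', 'emb_dim1_bucket0', 'emb_dim2_bucket3', 'emb_dim3_bucket2']
```

Such discrete features can then be conjoined with the parser's existing templates (word, POS tag, head-modifier pairs), which is how cluster-based features are typically incorporated into feature-based parsers.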