Paper: Dependency-Based Word Embeddings

ACL ID P14-2050
Title Dependency-Based Word Embeddings
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2014

While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embeddings are less topical and exhibit more functional similarity than the original skip-gram embeddings.
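To illustrate the idea of dependency-based contexts, the sketch below extracts (word, context) pairs from a hand-annotated dependency parse: each word is paired with its syntactic head (with an inverse-marked relation) and each head with its modifiers. The token format, labels, and the tiny example sentence are assumptions for illustration, not the paper's exact preprocessing.

```python
# Sketch: deriving dependency-based (word, context) pairs for skip-gram
# training. The parse below is hand-annotated for illustration; no parser
# is run, and the label conventions are assumptions, not the paper's exact
# format.

# Each token: (index, word, head_index, dependency_label); head 0 = root.
parsed = [
    (1, "scientist", 2, "nsubj"),
    (2, "discovers", 0, "root"),
    (3, "star",      2, "dobj"),
]

def dependency_contexts(tokens):
    """Yield (word, context) pairs: each modifier sees its head through an
    inverse-marked relation, and each head sees its modifiers directly."""
    words = {i: w for i, w, _, _ in tokens}
    pairs = []
    for _, word, head, label in tokens:
        if head != 0:  # skip the root, which has no head in the sentence
            pairs.append((word, f"{words[head]}/{label}-1"))  # head context
            pairs.append((words[head], f"{word}/{label}"))    # modifier context
    return pairs

for w, c in dependency_contexts(parsed):
    print(w, c)
```

Unlike a linear window, these contexts pair "discovers" with both its subject and object regardless of their surface distance, which is what drives the more functional (less topical) similarity reported in the abstract.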