Paper: Representing words as regions in vector space

ACL ID W09-1109
Title Representing words as regions in vector space
Venue International Conference on Computational Natural Language Learning
Session Main Conference
Year 2009
Authors
  • Katrin Erk (University of Texas at Austin, Austin TX)

Vector space models of word meaning typically represent the meaning of a word as a vector computed by summing over all its corpus occurrences. Words close to this point in space can be assumed to be similar to it in meaning. But how far around this point does the region of similar meaning extend? In this paper we discuss two models that represent word meaning as regions in vector space. Both representations can be computed from traditional point representations in vector space. We find that both models perform at over 95% F-score on a token classification task.
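As a rough illustration of the idea, the sketch below shows one minimal way a region could be derived from point representations: take the centroid of a word's token (occurrence) vectors and pick a radius that covers most of them, then classify a new token vector by whether it falls inside that sphere. This is only an assumed, simplified stand-in, not the two models the paper actually proposes; the function names, the spherical shape, and the coverage parameter are all hypothetical choices for exposition.

```python
# Minimal sketch (NOT the paper's models): a word's region as a sphere
# around the centroid of its token vectors.
import numpy as np


def region_from_tokens(token_vectors: np.ndarray, coverage: float = 0.95):
    """Return (centroid, radius) so the sphere covers `coverage` of the tokens.

    token_vectors: shape (n_tokens, dim), one vector per corpus occurrence.
    """
    centroid = token_vectors.mean(axis=0)                       # point representation
    distances = np.linalg.norm(token_vectors - centroid, axis=1)
    radius = np.quantile(distances, coverage)                   # extent of the region
    return centroid, radius


def in_region(vector: np.ndarray, centroid: np.ndarray, radius: float) -> bool:
    """Token classification: does this occurrence vector fall inside the region?"""
    return float(np.linalg.norm(vector - centroid)) <= radius


if __name__ == "__main__":
    # Hypothetical usage with random vectors standing in for real occurrence data.
    rng = np.random.default_rng(0)
    occurrences = rng.normal(size=(200, 50))
    c, r = region_from_tokens(occurrences)
    print(in_region(occurrences[0], c, r))
```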