Paper: Semantic Compositionality through Recursive Matrix-Vector Spaces

ACL ID D12-1110
Title Semantic Compositionality through Recursive Matrix-Vector Spaces
Venue Conference on Empirical Methods in Natural Language Processing
Session Main Conference
Year 2012
Authors Richard Socher, Brody Huval, Christopher D. Manning, Andrew Y. Ng

Single-word vector space models have been very successful at learning lexical information. However, they cannot capture the compositional meaning of longer phrases, preventing them from a deeper understanding of language. We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Our model assigns a vector and a matrix to every node in a parse tree: the vector captures the inherent meaning of the constituent, while the matrix captures how it changes the meaning of neighboring words or phrases. This matrix-vector RNN can learn the meaning of operators in propositional logic and natural language. The model obtains state-of-the-art performance on three different experiments: predicting fine-grained sentiment distributions of adverb-adjective pairs; classifying sentiment labels of movie reviews; and classifying semantic relationships such as cause-effect or topic-message between nouns using the syntactic path between them.
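The composition the abstract describes can be made concrete with a short sketch. In the paper's MV-RNN, each child's matrix first transforms its sibling's vector, and the parent representation is p = g(W[Ba; Ab]) with parent matrix P = W_M[A; B]. The NumPy sketch below follows that formulation; the dimensionality, random initialization, and names such as `compose`, `W`, and `W_M` are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of matrix-vector composition: every constituent carries a
# vector (its meaning) and a matrix (how it modifies neighbors). Parameter
# sizes, initialization, and names here are illustrative assumptions.
import numpy as np

n = 50                                          # word-vector dimensionality (illustrative)
rng = np.random.default_rng(0)

# Global composition parameters; in the real model these are learned.
W = rng.normal(scale=0.01, size=(n, 2 * n))     # combines the two transformed vectors
W_M = rng.normal(scale=0.01, size=(n, 2 * n))   # combines the two children's matrices

def compose(a, A, b, B):
    """Merge children (a, A) and (b, B) into a parent (p, P).

    Each child's matrix transforms its sibling's vector, so a modifier
    like "very" can scale or rotate the meaning of "good".
    """
    # Parent vector: p = tanh(W [B a ; A b])
    p = np.tanh(W @ np.concatenate([B @ a, A @ b]))
    # Parent matrix: P = W_M [A ; B], stacking the children's matrices
    P = W_M @ np.vstack([A, B])
    return p, P

# Toy usage: compose two word representations, yielding a parent that can
# itself be composed at the next parse-tree node, as a recursive net does.
a, A = rng.normal(size=n), np.eye(n)
b, B = rng.normal(size=n), np.eye(n)
p, P = compose(a, A, b, B)
print(p.shape, P.shape)                          # (50,) (50, 50)
```

Note that projecting the stacked child matrices through W_M keeps the parent matrix the same size as a word matrix, which is what lets the composition recurse up a parse tree of arbitrary depth.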