Paper: Opinion Mining with Deep Recurrent Neural Networks

ACL ID D14-1080
Title Opinion Mining with Deep Recurrent Neural Networks
Venue Conference on Empirical Methods in Natural Language Processing
Session Main Conference
Year 2014

Recurrent neural networks (RNNs) are connectionist models of sequential data that are naturally applicable to the analysis of natural language. Recently, "depth in space", as a notion orthogonal to "depth in time", has been investigated in RNNs by stacking multiple layers of RNNs, and shown empirically to bring a temporal hierarchy to the architecture. In this work we apply these deep RNNs to the task of opinion expression extraction, formulated as a token-level sequence-labeling task. Experimental results show that deep, narrow RNNs outperform traditional shallow, wide RNNs with the same number of parameters. Furthermore, our approach outperforms previous CRF-based baselines, including the state-of-the-art semi-Markov CRF model, and does so without access to the powerful opinion ...
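To make the "depth in space" vs. "depth in time" distinction concrete, the following is a minimal sketch (not the paper's implementation) of a stacked Elman RNN forward pass for token-level sequence labeling: the inner loop recurs over tokens (depth in time), and the outer loop feeds each layer's hidden sequence to the layer above it (depth in space). All names, shapes, and the tanh nonlinearity are illustrative assumptions.

```python
import numpy as np

def deep_rnn_forward(x, layers, W_out, b_out):
    """Forward pass of a stacked ("deep in space") Elman RNN.

    x       : (T, d_in) array of token embeddings for a T-token sentence
    layers  : list of (W_in, W_rec, b) weight tuples, one per stacked layer
    W_out   : (d_h, n_tags) output projection; b_out: (n_tags,) bias
    Returns : (T, n_tags) per-token tag scores (token-level sequence labeling)
    """
    h_seq = x
    for W_in, W_rec, b in layers:            # "depth in space": stack layers
        h = np.zeros(W_rec.shape[0])
        outputs = []
        for t in range(h_seq.shape[0]):      # "depth in time": recur over tokens
            h = np.tanh(h_seq[t] @ W_in + h @ W_rec + b)
            outputs.append(h)
        h_seq = np.stack(outputs)            # feed hidden sequence to next layer
    return h_seq @ W_out + b_out             # score every token over the tag set
```

Under this sketch, a "deep, narrow" model uses several layers with a small hidden size, while a "shallow, wide" model uses one layer with a large hidden size; the abstract's claim is that the former wins at equal parameter count.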