Paper: Aligning context-based statistical models of language with brain activity during reading

ACL ID D14-1030
Title Aligning context-based statistical models of language with brain activity during reading
Venue Conference on Empirical Methods in Natural Language Processing
Session Main Conference
Year 2014
Authors Leila Wehbe, Ashish Vaswani, Kevin Knight, Tom Mitchell

Many statistical models for natural language processing exist, including context-based neural networks that (1) model the previously seen context as a latent feature vector, (2) integrate successive words into the context using some learned representation (embedding), and (3) compute output probabilities for incoming words given the context. On the other hand, brain imaging studies have suggested that during reading, the brain (a) continuously builds a context from the successive words and, every time it encounters a word, (b) fetches its properties from memory and (c) integrates it with the previous context, with a degree of effort that is inversely proportional to how probable the word is. This hints at a parallelism between the neural networks and the brain in modeling con...
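
The following is a minimal sketch, not the paper's actual model, of the kind of context-based recurrent language model the abstract describes: a latent context vector, integration of each incoming word's embedding into that context, and a probability distribution over the next word. The surprisal printed at each step, -log P(word | context), corresponds to the abstract's notion of integration effort being inversely related to word probability. All dimensions, parameter names, and the toy word-id sequence are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, hidden_dim = 1000, 64, 128

# Learned parameters (randomly initialised here for illustration only).
E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))      # (2) word embeddings
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))   # context -> context
W_x = rng.normal(scale=0.1, size=(embed_dim, hidden_dim))    # embedding -> context
W_o = rng.normal(scale=0.1, size=(hidden_dim, vocab_size))   # context -> vocabulary logits


def step(context, word_id):
    """Integrate one word into the latent context and predict the next word."""
    x = E[word_id]                                  # fetch the word's representation
    context = np.tanh(context @ W_h + x @ W_x)      # (1)+(2) update the latent context vector
    logits = context @ W_o
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                            # (3) P(next word | context)
    return context, probs


# Process a short (hypothetical) word-id sequence; "integration effort" is
# modelled as surprisal, which is large for improbable words.
context = np.zeros(hidden_dim)
probs = np.full(vocab_size, 1.0 / vocab_size)       # uniform prior before any context
for w in [12, 305, 7, 42]:
    surprisal = -np.log(probs[w])
    print(f"word {w}: surprisal = {surprisal:.2f} nats")
    context, probs = step(context, w)
```

In this sketch the context vector plays the role of the brain's continuously built context, the embedding lookup plays the role of fetching a word's properties from memory, and the recurrent update plays the role of integrating the word with the previous context.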