Paper: Integrating surprisal and uncertain-input models in online sentence comprehension: formal techniques and empirical results

ACL ID P11-1106
Venue Annual Meeting of the Association for Computational Linguistics
Session Main Conference
Year 2011
Authors

A system making optimal use of available information in incremental language comprehension might be expected to use linguistic knowledge together with current input to revise beliefs about previous input. Under some circumstances, such an error-correction capability might induce comprehenders to adopt grammatical analyses that are inconsistent with the true input. Here we present a formal model of how such input-unfaithful garden paths may be adopted and the difficulty incurred by their subsequent disconfirmation, combining a rational noisy-channel model of syntactic comprehension under uncertain input with the surprisal theory of incremental processing difficulty. We also present a behavioral experiment confirming the key empirical predictions of the theory.
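For readers unfamiliar with surprisal theory, the quantity at its core is the negative log probability of a word given its preceding context, which the theory takes as a predictor of incremental processing difficulty. The sketch below is purely illustrative and not the paper's model: the word probabilities are hypothetical values chosen by hand, and a real implementation would derive them from a probabilistic grammar or language model.

```python
import math

def surprisal(prob: float) -> float:
    """Surprisal of an event in bits: -log2 P(w_i | w_1..w_{i-1})."""
    return -math.log2(prob)

# Hypothetical incremental probabilities for a classic garden-path sentence.
# The disambiguating word ("fell") is assigned a very low conditional
# probability, so it carries high surprisal -- the theory's locus of
# processing difficulty.
sentence = [
    ("the",   0.20),
    ("horse", 0.05),
    ("raced", 0.10),
    ("past",  0.30),
    ("the",   0.40),
    ("barn",  0.20),
    ("fell",  0.001),
]

for word, p in sentence:
    print(f"{word:>6}: {surprisal(p):6.2f} bits")
```

Under an uncertain-input model like the one the paper proposes, these conditional probabilities would instead be computed by marginalizing over a posterior distribution on what the earlier input actually was, rather than conditioning on a single veridical word string.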