A model of local coherence effects in human sentence processing as consequences of updates from bottom-up prior to posterior beliefs

Human sentence processing involves integrating probabilistic knowledge from a variety of sources in order to incrementally determine the hierarchical structure of the serial input stream. While a large number of sentence processing effects have been explained in terms of comprehenders' rational use of probabilistic information, effects of local coherences have not. We present a new model of local coherences, viewing them as resulting from a belief-update process, and show that the relevant probabilities in our model are computable from a probabilistic Earley parser. Finally, we demonstrate empirically that an implemented version of the model makes the correct predictions for the materials from the original experiment demonstrating local coherence effects.
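The core idea above — that a local coherence effect arises when bottom-up evidence about a substring must be reconciled with what the left context licenses — can be illustrated with a toy sketch. This is not the paper's implementation (which derives the probabilities from a probabilistic Earley parser); it simply measures the size of a belief update as a KL divergence between a made-up bottom-up prior and a context-conditioned posterior over two candidate analyses:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) in bits over a shared support of analyses."""
    return sum(p[a] * math.log2(p[a] / q[a]) for a in p if p[a] > 0)

# Two candidate analyses of "the player tossed ..." as it might occur in
# "... at the player tossed a frisbee": a locally coherent main-clause
# reading vs. the globally required reduced-relative reading.
# All probabilities are invented purely for demonstration.

# Bottom-up prior: what the substring in isolation favors.
prior = {"main_clause": 0.9, "reduced_relative": 0.1}

# Posterior: what the full left context actually licenses.
posterior = {"main_clause": 0.01, "reduced_relative": 0.99}

# A large update from prior to posterior predicts a local coherence
# effect (processing difficulty) at this region.
effect = kl_divergence(posterior, prior)
print(round(effect, 3))
```

On this toy example the update is large (over 3 bits), whereas a substring whose bottom-up reading matches its context would yield an update near zero.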
