Spoken Word Recognition in the Visual World Paradigm Reflects the Structure of the Entire Lexicon

When subjects are asked to move items in a visual display in response to spoken instructions, their eye movements are closely time-locked to the unfolding speech signal. A recently developed eye-tracking method, the “visual world paradigm”, exploits this phenomenon to provide a sensitive, continuous measure of ambiguity resolution in language processing, including competition effects in spoken word recognition (Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995). With this method, competition is typically measured between the names of objects displayed simultaneously in front of the subject. Fixation probabilities may therefore reflect not competition within the entire lexicon, but only competition among items that become active because they are displayed together. To test this, we created a small artificial lexicon with specific lexical similarity characteristics. Subjects learned novel names for 16 novel geometric objects, which were presented with high, medium, or low frequency during training. Each lexical item had two potential competitors. The crucial comparison was between high-frequency items that had either high- or low-frequency competitors. In spoken word recognition, performance is correlated with the number of frequency-weighted neighbors (phonologically similar words) a word has, suggesting that neighbors compete for recognition as a function of frequency and similarity (e.g., Luce & Pisoni, 1998). We found that in the visual world paradigm, fixation probabilities for items with high-frequency neighbors were delayed relative to those for items with low-frequency neighbors, even when the target items were presented alongside unrelated items. This indicates that fixation probabilities reflect the internal structure of the lexicon, not just the characteristics of the displayed items.
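The frequency-weighted neighbor count invoked above can be made concrete. The sketch below follows the standard definition of phonological neighbors (words differing by one phoneme substitution, addition, or deletion) and weights each neighbor by its log frequency, in the spirit of Luce and Pisoni's (1998) Neighborhood Activation Model. The toy lexicon, the one-character-per-phoneme encoding, and the log-frequency weighting are illustrative assumptions, not the study's actual stimuli or scoring method.

```python
# Hedged sketch: frequency-weighted phonological neighborhood density.
# Assumes one character = one phoneme; the lexicon and weighting are toy examples.
import math

def is_neighbor(a, b):
    """True if phoneme sequences a and b differ by exactly one
    substitution, addition, or deletion (the one-phoneme neighbor rule)."""
    if a == b:
        return False
    la, lb = len(a), len(b)
    if la == lb:
        # Same length: neighbor iff exactly one position differs.
        return sum(x != y for x, y in zip(a, b)) == 1
    if abs(la - lb) != 1:
        return False
    # Lengths differ by one: neighbor iff deleting one phoneme
    # from the longer sequence yields the shorter one.
    longer, shorter = (a, b) if la > lb else (b, a)
    return any(longer[:i] + longer[i + 1:] == shorter
               for i in range(len(longer)))

def weighted_density(word, lexicon):
    """Sum of log frequencies of a word's one-phoneme neighbors."""
    return sum(math.log(freq)
               for other, freq in lexicon.items()
               if is_neighbor(tuple(word), tuple(other)))

# Hypothetical toy lexicon: phoneme strings mapped to frequency counts.
lexicon = {"kat": 500, "bat": 900, "kab": 20, "dog": 300}
```

On this toy lexicon, "kat" has two neighbors ("bat" and "kab"), so its density is dominated by the high-frequency neighbor "bat"; swapping "bat" for a low-frequency neighbor would lower the density, mirroring the high- vs. low-frequency-competitor manipulation described in the abstract.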

[1] James L. McClelland, et al. The TRACE model of speech perception, 1986, Cognitive Psychology.

[2] W. Marslen-Wilson, et al. Accessing Spoken Words: The Importance of Word Onsets, 1989.

[3] Eileen Kowler. Eye movements and their role in visual and cognitive processes, 1990, Reviews of Oculomotor Research.

[4] P. Viviani. Eye movements in visual search: cognitive, perceptual and motor control aspects, 1990, Reviews of Oculomotor Research.

[5] Richard Shillcock, et al. Cognitive Models of Speech Processing: The Second Sperlonga Meeting, 1993.

[6] Matthew Flatt, et al. PsyScope: An interactive graphic system for designing and controlling experiments in the psychology laboratory using Macintosh computers, 1993.

[7] Dawn G. Blasko, et al. Do the Beginnings of Spoken Words Have a Special Status in Auditory Word Recognition?, 1993.

[8] K. Boff, et al. Saccadic overhead: Information-processing time with and without saccades, 1993, Perception & Psychophysics.

[9] W. Marslen-Wilson, et al. Levels of perceptual representation and process in lexical access: words, phonemes, and features, 1994, Psychological Review.

[10] Maryellen C. MacDonald, et al. The lexical nature of syntactic ambiguity resolution, 1994.

[11] S. Blumstein, et al. The effect of subphonetic differences on lexical access, 1994, Cognition.

[12] John C. Trueswell, et al. Chapter 7 – Sentence Comprehension, 1995.

[13] M. Tanenhaus, et al. Integration of visual and linguistic information in spoken language comprehension, 1995, Science.

[14] Paul D. Allopenna, et al. Tracking the Time Course of Spoken Word Recognition Using Eye Movements: Evidence for Continuous Mapping Models, 1998.

[15] J. Magnuson. Using an Artificial Lexicon and Eye Movements to Examine the Development and Microstructure of Lexical Dynamics, 1998.

[16] D. Pisoni, et al. Recognizing Spoken Words: The Neighborhood Activation Model, 1998, Ear and Hearing.