Abstract language comprehension is incrementally modulated by non-referential spatial information: evidence from eye-tracking

Ernesto Guerra (ernesto.guerra@uni-bielefeld.de)
Pia Knoeferle (knoeferl@cit-ec.uni-bielefeld.de)
Cognitive Interaction Technology Excellence Cluster, Bielefeld University, Morgenbreede 39, 33615 Bielefeld, Germany

Abstract

Research on situated language processing has examined how visually depicted objects or concrete action events inform the comprehension of concrete sentences. By contrast, much less is known about how abstract sentence comprehension interacts with non-linguistic visual information. Moreover, while non-linguistic information can rapidly inform language comprehension when it is related to sentence content through reference or lexical-semantic associations, it is unclear to which extent this is the case when the visual context is 'non-referential' (i.e., not related to the sentence through reference or lexical-semantic associations). We conducted two eye-tracking reading experiments to address these two open issues. In both experiments, reading times were shorter when sentences about conceptually similar abstract ideas were preceded by objects (words-on-cards in Experiment 1 and blank playing cards in Experiment 2) that were depicted close together (vs. far apart), and when sentences about conceptually dissimilar abstract ideas were preceded by objects that were depicted far apart (vs. close together). This happened rapidly (in first-pass reading times) and incrementally (as the sentence unfolded). Thus, (a) comprehension of abstract language can be modulated by non-linguistic visual information (spatial distance between depicted objects) at the sentence level, and (b) online language comprehension can be informed by visual context even in the absence of an overt referential or lexical-semantic link.

Keywords: semantic interpretation; spatial information; non-referential visual context; eye tracking.

Introduction

Studies in the 'visual world paradigm' have contributed extensively to our understanding of how non-linguistic visual information affects sentence comprehension (e.g., syntactic disambiguation: Tanenhaus et al., 1995; semantic interpretation: Sedivy et al., 1999).
In visual world studies, listeners' eye movements are tracked during comprehension of a spoken sentence that describes a given visual environment. Findings from such studies have shown that visual presentation of objects or concrete action events can facilitate incremental structural disambiguation (e.g., Tanenhaus et al., 1995; Knoeferle, Crocker, Scheepers, & Pickering, 2005); that language can rapidly guide visual attention to semantically relevant objects, as evidenced by anticipatory eye movements (e.g., Altmann & Kamide, 1999; Kamide, Scheepers, & Altmann, 2003; Kamide, Altmann, & Haywood, 2003); and that distractor objects are inspected more often when they are semantically related (vs. unrelated) to a target word (e.g., Huettig & Altmann, 2005, 2011; Huettig & McQueen, 2007).

Visual context not only affects spoken language comprehension rapidly, but also sentence comprehension during reading. Evidence from picture-sentence verification has revealed rapid visual context effects for concrete visual stimuli (e.g., red dots) and sentence content (e.g., "The dots are red"; see Clark & Chase, 1972; also Gough, 1965; Knoeferle, Urbach, & Kutas, 2011; Underwood, Jebbett, & Roberts, 2004). However, most of these studies have concentrated on sentences about concrete objects and events. While evidence suggests that visual context can rapidly and incrementally inform comprehension of concrete spoken and written sentences, it is unclear to which extent non-linguistic visual context information can influence the processing of abstract language rapidly and incrementally.

In examining situated language comprehension, most visual world studies have further relied on a referential linking hypothesis (e.g., a noun referencing an object, or a verb an action). By contrast, it is unclear whether visually presented information can influence sentence comprehension when there is no overt referential or lexical-semantic link with sentence content.

Spatial Distance and Semantic Similarity

Conceptual metaphor theory proposes that abstract meaning is grounded in physical experience through metaphorical mapping (Lakoff & Johnson, 1999). Similarity, for instance, would be grounded in the physical experience of spatial distance. Recent behavioral studies have provided first evidence for a link between spatial distance and similarity. In one study, two visually presented abstract words (e.g., "loyalty" and "boredom") were judged to be more similar when they were presented close together (vs. far apart), but more dissimilar when they were presented far apart (vs. close together; Casasanto, 2008). In another study, participants judged whether two squares on a screen had similar colors or not: speeded decision times were shorter when similarly-colored squares were presented close to each other (vs. far apart), and when differently-colored squares were presented far apart (vs. close to each other; Boot & Pecher, 2010). These rating and response-time effects support the view that there is a relationship of some sort between spatial information (the distance between two stimuli) and semantic and visual similarity (a minimal worked sketch of the predicted pattern follows below).
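To make the predicted pattern concrete, the following minimal sketch illustrates the crossover interaction that such a distance-similarity mapping predicts for 2x2 designs like those of Casasanto (2008) and Boot & Pecher (2010). The cell means are hypothetical values chosen only for illustration; they are not data from any study cited here.

```python
# Hypothetical mean response times (ms) per cell of a
# 2 (spatial distance) x 2 (semantic similarity) design.
means = {
    ("close", "similar"):    820,
    ("close", "dissimilar"): 870,
    ("far",   "similar"):    880,
    ("far",   "dissimilar"): 830,
}

# The 2x2 interaction contrast: the distance effect (far - close)
# for dissimilar items minus the distance effect for similar items.
# A negative value reflects the predicted crossover: far placement
# speeds dissimilar judgments, close placement speeds similar ones.
interaction = (
    (means[("far", "dissimilar")] - means[("close", "dissimilar")])
    - (means[("far", "similar")] - means[("close", "similar")])
)
print(interaction)  # -100 ms with these illustrative values
```

A main effect of distance alone would shift both similarity conditions in the same direction; it is the sign flip across similarity conditions that distinguishes the metaphorical-mapping account.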
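The two reading experiments summarized in the abstract take first-pass reading time as their earliest measure. As a rough illustration of how that measure is standardly derived (this is not the authors' analysis code; the Fixation type, positions, durations, and region bounds are all invented for the example), first-pass reading time for an interest area sums fixation durations from the first entry into the area until the eyes first leave it in either direction:

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float        # horizontal position in pixels
    duration: int   # fixation duration in ms

def first_pass_time(fixations, region_start, region_end):
    """Sum durations of fixations within [region_start, region_end)
    from first entry into the region until the first exit."""
    total = 0
    entered = False
    for fix in fixations:
        inside = region_start <= fix.x < region_end
        if inside:
            entered = True
            total += fix.duration
        elif entered:
            break  # region left after first entry: first pass is over
    return total

# Hypothetical trial: the reader fixates before, inside, and after an
# interest area spanning 300-500 px, then refixates it later.
trial = [Fixation(120, 210), Fixation(340, 250), Fixation(420, 180),
         Fixation(560, 230), Fixation(380, 190)]
print(first_pass_time(trial, 300, 500))  # 430: the later refixation is excluded
```

Because refixations after the first exit are excluded, the measure indexes early processing, which is why effects in first-pass times are taken as evidence that visual context influenced comprehension rapidly.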
References

Altmann, G. T. M., & Kamide, Y. (1999). Incremental interpretation at verbs: Restricting the domain of subsequent reference. Cognition.
Boot, I., & Pecher, D. (2010). Similarity is closeness: Metaphorical mapping in a conceptual task. Quarterly Journal of Experimental Psychology.
Casasanto, D. (2008). Similarity and proximity: When does close in space mean close in mind? Memory & Cognition.
Clark, H. H., & Chase, W. G. (1972). On the process of comparing sentences against pictures. Cognitive Psychology.
Duñabeitia, J. A., Avilés, A., Afonso, O., Scheepers, C., & Carreiras, M. (2009). Qualitative differences in the representation of abstract versus concrete words: Evidence from the visual-world paradigm. Cognition.
Gough, P. B. (1965). Grammatical transformations and speed of understanding. Journal of Verbal Learning and Verbal Behavior.
Huettig, F., & Altmann, G. T. M. (2005). Word meaning and the control of eye fixation: Semantic competitor effects and the visual world paradigm. Cognition.
Huettig, F., & Altmann, G. T. M. (2011). Looking at anything that is green when hearing "frog": How object surface colour and stored object colour knowledge influence language-mediated overt attention. Quarterly Journal of Experimental Psychology.
Huettig, F., & McQueen, J. M. (2007). The tug of war between phonological, semantic and shape information in language-mediated visual search. Journal of Memory and Language.
Kamide, Y., Altmann, G. T. M., & Haywood, S. L. (2003). The time-course of prediction in incremental sentence processing: Evidence from anticipatory eye-movements. Journal of Memory and Language.
Kamide, Y., Scheepers, C., & Altmann, G. T. M. (2003). Integration of syntactic and semantic information in predictive processing: Cross-linguistic evidence from German and English. Journal of Psycholinguistic Research.
Knoeferle, P., & Crocker, M. W. (2006). The coordinated interplay of scene, utterance, and world knowledge: Evidence from eye tracking. Cognitive Science.
Knoeferle, P., & Crocker, M. W. (2007). The influence of recent scene events on spoken comprehension: Evidence from eye movements. Journal of Memory and Language.
Knoeferle, P., Crocker, M. W., Scheepers, C., & Pickering, M. J. (2005). The influence of the immediate visual context on incremental thematic role-assignment: Evidence from eye-movements in depicted events. Cognition.
Knoeferle, P., & Kreysa, H. (2011). Effects of speaker gaze on spoken language comprehension: Task matters. In Proceedings of the 33rd Annual Conference of the Cognitive Science Society.
Knoeferle, P., Urbach, T. P., & Kutas, M. (2011). Comprehending how visual context influences incremental sentence processing: Insights from ERPs and picture-sentence verification. Psychophysiology.
Lakoff, G., & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to Western thought. New York: Basic Books.
Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin.
Sedivy, J. C., Tanenhaus, M. K., Chambers, C. G., & Carlson, G. N. (1999). Achieving incremental semantic interpretation through contextual representation. Cognition.
Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science.
Underwood, G., Jebbett, L., & Roberts, K. (2004). Inspecting pictures for information to verify a sentence: Eye movements in general encoding and in focused search. Quarterly Journal of Experimental Psychology.