Spatial distance modulates reading times for sentences about social relations: evidence from eye tracking

Ernesto Guerra 1,2 (ernesto.guerra@mpi.nl)
Pia Knoeferle 1 (knoeferl@cit-ec.uni-bielefeld.de)

1 Cognitive Interaction Technology Excellence Cluster and Department of Linguistics, Bielefeld University, Inspiration I, 33615 Bielefeld, Germany
2 Max Planck Institute for Psycholinguistics, Wundtlaan 1, 6525 XD Nijmegen, The Netherlands

Abstract

Recent evidence from eye tracking during reading showed that non-referential spatial distance presented in a visual context can modulate the semantic interpretation of similarity relations rapidly and incrementally. In two eye-tracking reading experiments we extended these findings in two important ways. First, we examined whether another semantic domain (social relations) could also be rapidly influenced by spatial distance during sentence comprehension. Second, we aimed to further specify how abstract language is co-indexed with spatial information by varying the syntactic structure of the sentences between experiments. Spatial distance rapidly modulated reading times as a function of the social relation expressed by a sentence. Moreover, our findings suggest that abstract language can be co-indexed with spatial information as soon as the critical information becomes available to the reader.

Keywords: spatial distance, social distance, semantic interpretation, eye tracking.

Introduction

Recent eye-tracking evidence showed that the spatial distance between depicted objects can distinctively modulate reading times for sentences expressing semantic similarity (Guerra & Knoeferle, 2012). Participants inspected objects (playing cards) and then read a sentence about abstract ideas (e.g., "Peace and war are certainly different…"). Reading times were shorter for sentences expressing similarity between two abstract "and"-coordinated nouns when the cards had been presented close together rather than farther apart. For sentences expressing dissimilarity, the opposite pattern emerged: reading times were shorter when the cards had been presented far apart (vs. close together).

These results represent an important advance in our understanding of the relation between visual context effects and sentence interpretation. They suggest that visual information can influence the interpretation of abstract language, an effect previously shown for concrete language (see, e.g., Tanenhaus et al., 1995). Moreover, they suggest that linguistic and non-linguistic information can interact in the absence of an overt referential link or lexical association (cf. Altmann & Kamide, 2007; Knoeferle & Crocker, 2007).

However, several open questions remain concerning the extent to which spatial distance affects abstract language processing and the mechanisms underlying such effects. The investigation of non-referential visual context effects in Guerra and Knoeferle (2012) was motivated by a linking hypothesis from Conceptual Metaphor Theory (CMT; Lakoff & Johnson, 1999). To accommodate the rapid and incremental effects of spatial distance on semantic interpretation, the authors relied on a mechanism that relates corresponding elements in the sentence and in the visual context by co-indexing them (see the Coordinated Interplay Account, CIA; Knoeferle & Crocker, 2006, 2007). Yet it is unclear whether spatial distance can rapidly influence the processing of semantic relations other than similarity (see Lakoff & Johnson, 1999). In addition, it remains to be seen precisely how abstract language is co-indexed with spatial distance depicted in the visual context during comprehension. The present study examined spatial distance effects on another abstract domain (social relations) and, additionally, assessed the co-indexing between visual cues and abstract language during comprehension.

Spatial distance and social relations

In everyday language, people commonly use spatial concepts to communicate aspects of social relations, in expressions such as "he's a close friend". CMT suggests that such expressions arise because abstract representations such as social intimacy are grounded in physical experiences such as spatial distance through metaphorical mapping (Lakoff & Johnson, 1999). Recent behavioral studies have investigated the link between social and spatial distance.
For instance, Williams and Bargh (2008) found that participants reported weaker bonds to their families and hometowns after they had been primed with far (vs. close) distance (by marking off two points on a Cartesian plane, either far apart or close together). More recently, Matthews and Matlock (2011) found that in a path-drawing task participants drew paths closer to figures described to them as friends (vs. strangers). A further study reported that perceived distance (in a picture with depth perspective, e.g., scenery of alleys with trees) interacted with the content of written words (i.e., friend vs. enemy), modulating response latencies in a distance-estimation and a word-classification task (Bar-Anan et al., 2007). In both tasks, response times were longer when the word friend was presented far away in the picture (compared to close), and the opposite pattern emerged for the word enemy.
References

[1] Louise Connell, et al. (2012). When does perception facilitate or interfere with conceptual processing? The effect of attentional modulation. Frontiers in Psychology.
[2] Michael P. Kaschak, et al. (2005). Perception of motion affects language processing. Cognition.
[3] D. Mirman, et al. (2012). Competition and cooperation among similar representations: toward a unified account of facilitative and inhibitory effects of lexical neighbors. Psychological Review.
[4] Julie C. Sedivy, et al. (1995). Integration of visual and linguistic information in spoken language comprehension. Science.
[5] J. Bargh, et al. (2008). Keeping One's Distance. Psychological Science.
[6] D. Barr, et al. (2013). Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language.
[7] Daniel C. Richardson, et al. (2003). Spatial representations activated during real-time comprehension of verbs. Cognitive Science.
[8] K. Rayner (2009). The 35th Sir Frederick Bartlett Lecture: Eye movements and attention in reading, scene perception, and visual search. Quarterly Journal of Experimental Psychology.
[9] Rolf A. Zwaan, et al. (2012). Revisiting Mental Simulation in Language Comprehension: Six Replication Attempts. PLoS ONE.
[10] L. Connell (2007). Representing object colour in language comprehension. Cognition.
[11] Pia Knoeferle, et al. (2012). Abstract language comprehension is incrementally modulated by non-referential spatial information: evidence from eye-tracking. CogSci.
[12] Matthew W. Crocker, et al. (2007). The influence of recent scene events on spoken comprehension: Evidence from eye movements.
[13] G. Lakoff, et al. (1999). Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought.
[14] Yaacov Trope, et al. (2007). Automatic processing of psychological distance: evidence from a Stroop task. Journal of Experimental Psychology: General.
[15] Frank Keller, et al. (2010). Syntactic priming in comprehension: Parallelism effects with and without coordination.
[16] G. Altmann (2004). Language-mediated eye movements in the absence of a visual world: the 'blank screen paradigm'. Cognition.
[17] Justin L. Matthews, et al. (2011). Understanding the Link Between Spatial Distance and Social Distance.
[18] G. Altmann, et al. (2007). The real-time mediation of visual attention by language and world knowledge: Linking anticipatory (and other) eye movements to linguistic processing.