Eyetracking for two-person tasks with manipulation of a virtual world

Eyetracking facilities are typically restricted to monitoring a single person viewing static images or prerecorded video. In the present article, we describe a system that makes it possible to study visual attention in coordination with other activity during joint action. The software links two eyetracking systems in parallel and provides an on-screen task. By locating eye movements against dynamic screen regions, it permits automatic tracking of moving on-screen objects. Using existing SR Research technology, the system can also cross-project each participant's eyetrack and mouse location onto the other's on-screen work space. Because it keeps a complete record of eyetrack and on-screen events in the same format used for subsequent human coding, the system permits the analysis of multiple modalities. The software opens up new approaches to studying spontaneous multimodal communication: joint action and joint attention. These capacities are demonstrated using an experimental paradigm for cooperative on-screen assembly of a two-dimensional model. The software is available under an open source license.
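The core idea of locating eye movements against dynamic screen regions can be sketched as a hit test of gaze samples against time-varying regions of interest. The following is a minimal illustrative sketch, not the software's actual API: the class names, field layout, and keyframe representation are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal screen position in pixels
    y: float  # vertical screen position in pixels

@dataclass
class MovingRegion:
    """A rectangular region of interest that follows an on-screen object."""
    name: str
    # Keyframes of (time, left, top, width, height), sorted by time.
    track: list

    def bounds_at(self, t: float):
        """Return the most recent keyframe bounds at or before time t."""
        current = None
        for kt, left, top, w, h in self.track:
            if kt <= t:
                current = (left, top, w, h)
            else:
                break
        return current

def hit_test(sample: GazeSample, regions: list) -> Optional[str]:
    """Return the name of the first region containing the gaze sample."""
    for region in regions:
        bounds = region.bounds_at(sample.t)
        if bounds is None:
            continue
        left, top, w, h = bounds
        if left <= sample.x < left + w and top <= sample.y < top + h:
            return region.name
    return None

# A model piece that jumps from x=100 to x=300 at t=1.0.
piece = MovingRegion("piece_A", [(0.0, 100, 100, 50, 50),
                                 (1.0, 300, 100, 50, 50)])
hit_test(GazeSample(0.5, 120, 120), [piece])  # "piece_A" (old position)
hit_test(GazeSample(1.5, 320, 120), [piece])  # "piece_A" (new position)
hit_test(GazeSample(1.5, 120, 120), [piece])  # None (piece has moved away)
```

Because both the object positions and the gaze samples are timestamped, the same lookup works offline over a complete session log, which is what allows fixations on moving objects to be coded automatically rather than frame by frame by hand.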
