Effects of task properties, partner actions, and message content on eye gaze patterns in a collaborative task

Helpers providing guidance for collaborative physical tasks shift their gaze among the workspace, the supply area, and the instructions. Understanding when and why helpers gaze at each area is important both for a theoretical account of collaboration on physical tasks and for the design of automated video systems for remote collaboration. In a laboratory experiment using a collaborative puzzle task, we recorded helpers' gaze while manipulating task complexity and the differentiability of puzzle pieces. Helpers gazed toward the pieces bay (the supply area) more frequently when pieces were difficult to differentiate, and less frequently over repeated trials. Preliminary analyses of message content show that helpers tend to look at the pieces bay when describing the next piece and at the workspace when describing where it goes. The results are consistent with a grounding model of communication, in which helpers seek visual evidence of understanding unless they are confident that they have been understood. The results also suggest the feasibility of building automated video systems driven by remote helpers' shifting visual requirements.
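
To make the design implication concrete, the sketch below illustrates one way an automated video system might exploit the reported pattern (helpers look at the pieces bay when identifying the next piece and at the workspace when describing its placement). It is purely illustrative and not the study's system: the view names, keyword cues, and ViewSelector API are hypothetical assumptions introduced here, and a real system would use gaze or richer dialogue features rather than keyword matching.

```python
"""Illustrative sketch only: a rule-based view selector inspired by the gaze
pattern reported in the abstract. All names and cue lists are assumptions."""

from dataclasses import dataclass

# Hypothetical camera views available to a remote helper.
PIECES_BAY = "pieces_bay"
WORKSPACE = "workspace"

# Toy keyword cues for the two dialogue moves discussed in the abstract.
PIECE_CUES = ("next piece", "find the", "grab the", "it looks like")
PLACEMENT_CUES = ("put it", "place it", "goes next to", "top left", "rotate")


@dataclass
class ViewSelector:
    """Chooses which video feed to show the helper, defaulting to the workspace."""
    current_view: str = WORKSPACE

    def update(self, utterance: str) -> str:
        text = utterance.lower()
        if any(cue in text for cue in PIECE_CUES):
            self.current_view = PIECES_BAY   # helper is identifying a piece
        elif any(cue in text for cue in PLACEMENT_CUES):
            self.current_view = WORKSPACE    # helper is describing placement
        # Otherwise keep the current view (no strong evidence to switch).
        return self.current_view


if __name__ == "__main__":
    selector = ViewSelector()
    for line in ["OK, find the piece with the red corner",
                 "Now put it next to the blue one"]:
        print(f"{line!r} -> show {selector.update(line)}")
```

In use, the first utterance would switch the feed to the pieces bay and the second back to the workspace, mirroring where helpers in the experiment actually looked at those moments.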
