Communicating Eye Gaze across a Distance without Rooting Participants to the Spot

Eye gaze is an important conversational resource that, until now, could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first tele-presence system that allows people in different physical locations not only to see what each other are doing but to follow each other's eyes, even while walking about. Projected into each space are avatar representations of the remote participants, which reproduce not only body, head and hand movements but also those of the eyes. Spatial and temporal alignment of the remote spaces allows the focus of gaze, as well as activity and gesture, to be used as a resource for non-verbal communication. The temporal challenge met was to reproduce eye movements quickly enough, and often enough, for their focus to be interpreted during multi-way interaction, alongside other verbal and non-verbal communication. The spatial challenge met was to maintain communicational eye gaze while allowing participants to move freely within a virtually shared common frame of reference. This paper reports on the technical, and especially the temporal, characteristics of the system.
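As a rough illustration of the temporal and spatial requirements described above, the sketch below shows one way eye-gaze samples could be streamed alongside head tracking and expressed in a shared coordinate frame so that a remote avatar can mirror gaze focus. The packet layout, update rate, and the helper callables (read_tracker, to_shared_frame) are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch (assumed design, not the EyeCVE implementation): stream eye-gaze
# samples together with head tracking, expressed in a shared world frame, so a
# remote avatar can reproduce gaze direction quickly and often enough to be read.

import struct
import time

GAZE_RATE_HZ = 60            # assumed update rate; fast enough to convey fixation changes
PACKET_FMT = "!d3f3f3f"      # timestamp, head position, head direction, gaze direction


def pack_sample(timestamp, head_pos, head_dir, gaze_dir):
    """Serialise one tracking sample for transmission to remote sites."""
    return struct.pack(PACKET_FMT, timestamp, *head_pos, *head_dir, *gaze_dir)


def unpack_sample(data):
    """Recover a tracking sample on the receiving side."""
    values = struct.unpack(PACKET_FMT, data)
    return values[0], values[1:4], values[4:7], values[7:10]


def stream_gaze(sock, remote_addr, read_tracker, to_shared_frame):
    """Send head and eye tracking data at a fixed rate.

    read_tracker() is assumed to return (head_pos, head_dir, gaze_dir) in local
    tracker coordinates; to_shared_frame() maps them into the common frame of
    reference shared by all sites, so remote avatars can mirror gaze focus.
    """
    interval = 1.0 / GAZE_RATE_HZ
    while True:
        head_pos, head_dir, gaze_dir = read_tracker()
        head_pos, head_dir, gaze_dir = to_shared_frame(head_pos, head_dir, gaze_dir)
        packet = pack_sample(time.time(), head_pos, head_dir, gaze_dir)
        sock.sendto(packet, remote_addr)
        time.sleep(interval)
```

The design choice illustrated here is that gaze is transmitted as a direction in the shared frame rather than as raw tracker coordinates, so free movement of participants does not break the alignment of gaze with objects and people in the common space.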
