Some Implications of Eye Gaze Behavior and Perception for the Design of Immersive Telecommunication Systems

A feature of standard video-mediated communication (VMC) systems is that participants see into each other's spaces from the viewpoint of a camera. Consequently, participants' capacity to use the spatially based resources available in co-located settings (e.g., the production and comprehension of pointing and eye-gaze direction) can be compromised. Whilst positioning cameras close to displays, or switching or interpolating between multiple cameras to provide appropriately aligned views, can reduce this problem, an alternative paradigm is to use immersive projection technology to locate participants within an immersive collaborative virtual environment (ICVE), in which remote participants appear as 3D graphical representations. Two approaches to representing remote participants in ICVEs have been studied: embodied avatars animated using participants' tracked body motion, and vision-based techniques that reconstruct 3D models from multiple streams of live video input. Drawing on empirical evaluations of an avatar-based ICVE system that both captures and displays eye movement, together with an examination of previous research into gaze, we provide a specification of the gaze practices, and the cues used in the perception of gaze, that ICVEs should support. We delineate some of the challenges for vision-based ICVEs and discuss the potential for combining the two approaches in the development of such systems.
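The vision-based approach mentioned above commonly reconstructs a participant's 3D form as a visual hull: the intersection of the back-projected silhouettes observed by multiple calibrated cameras. The following is a minimal voxel-carving sketch of that idea; the orthographic projections, grid size, and silhouettes are illustrative assumptions, not the pipeline of any particular system.

```python
# Visual hull by voxel carving: keep each voxel only if its projection
# falls inside the silhouette seen by every camera.

def carve_visual_hull(silhouettes, projections, grid_size):
    """silhouettes: per-camera sets of 2D pixel coordinates.
    projections: per-camera functions mapping a voxel to a pixel.
    Returns the set of voxels consistent with all silhouettes."""
    hull = set()
    for x in range(grid_size):
        for y in range(grid_size):
            for z in range(grid_size):
                voxel = (x, y, z)
                if all(project(voxel) in sil
                       for sil, project in zip(silhouettes, projections)):
                    hull.add(voxel)
    return hull

# Toy scene: a one-voxel-wide column at (x, y) = (1, 1) seen by two
# orthographic cameras. Camera A looks along z (images the x-y plane);
# camera B looks along y (images the x-z plane).
sil_a = {(1, 1)}                      # silhouette in camera A's image
sil_b = {(1, z) for z in range(4)}    # silhouette in camera B's image
proj_a = lambda v: (v[0], v[1])
proj_b = lambda v: (v[0], v[2])

hull = carve_visual_hull([sil_a, sil_b], [proj_a, proj_b], grid_size=4)
print(sorted(hull))  # → [(1, 1, 0), (1, 1, 1), (1, 1, 2), (1, 1, 3)]
```

With only two views the hull is exactly the column, but in general the visual hull is a conservative outer bound on the true shape: concavities that never appear on any silhouette cannot be carved away, which is one reason real-time systems refine the hull with texture or depth cues.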
