Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration

We present results from research exploring the effect of sharing virtual gaze and pointing cues in a wearable interface for remote collaboration. A local worker wears a head-mounted camera, an eye-tracking camera, and a head-mounted display, and shares live video and virtual gaze information with a remote helper. The remote helper can provide feedback through a virtual pointer overlaid on the live video view. The prototype system was evaluated in a formal user study comparing four conditions: (1) NONE (no cue), (2) POINTER, (3) EYE-TRACKER, and (4) BOTH (pointer and eye-tracker cues combined). Task completion performance was best in the BOTH condition, significantly better than with either the POINTER or the EYE-TRACKER cue alone. Sharing both the eye-tracking and pointer cues also significantly improved the sense of co-presence between the users. We discuss the implications of this research and the limitations of the developed system that could be addressed in future work.
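The cue-sharing loop described above can be pictured as a simple bidirectional exchange: gaze samples flow from the local worker to the remote helper, and pointer positions flow back for display in the worker's head-mounted display. The following is a minimal sketch of that exchange only; the paper does not describe its implementation, so the JSON-over-UDP transport, the endpoints, and all function names here are hypothetical assumptions.

    import json
    import socket

    # Hypothetical loopback endpoints; the paper does not specify a transport.
    HELPER_ADDR = ("127.0.0.1", 9000)  # remote helper's view client
    WORKER_ADDR = ("127.0.0.1", 9001)  # local worker's HMD client


    def send_gaze(sock, x, y):
        """Worker side: stream one gaze sample in normalized [0, 1] video coordinates."""
        msg = {"type": "gaze", "x": x, "y": y}
        sock.sendto(json.dumps(msg).encode("utf-8"), HELPER_ADDR)


    def send_pointer(sock, x, y):
        """Helper side: send a virtual pointer position back to the worker's HMD."""
        msg = {"type": "pointer", "x": x, "y": y}
        sock.sendto(json.dumps(msg).encode("utf-8"), WORKER_ADDR)


    def receive_cue(sock):
        """Block for one incoming cue (gaze or pointer) and return it as a dict."""
        data, _ = sock.recvfrom(1024)
        return json.loads(data.decode("utf-8"))


    if __name__ == "__main__":
        # Loopback demo: the "worker" sends a gaze sample to the "helper",
        # which echoes a pointer cue back at the same location.
        helper = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        helper.bind(HELPER_ADDR)
        worker = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        worker.bind(WORKER_ADDR)

        send_gaze(worker, 0.42, 0.63)               # worker fixates a point
        gaze = receive_cue(helper)                  # helper sees the gaze cue
        send_pointer(helper, gaze["x"], gaze["y"])  # helper points at it
        print("worker received:", receive_cue(worker))

In a real system the gaze coordinates would come from the eye-tracking camera and both cues would be rendered as overlays on the shared video; the sketch only captures the message flow that makes the POINTER, EYE-TRACKER, and BOTH conditions possible.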
