A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing

Supporting natural communication cues is critical for effective collaboration, whether remote or face-to-face. In this paper we present a Mixed Reality (MR) remote collaboration system that enables a local worker to share a live 3D panorama of their surroundings with a remote expert. The remote expert can also share task instructions back to the local worker using visual cues in addition to verbal communication. We conducted a user study to investigate how sharing augmented gaze and gesture cues from the remote expert to the local worker affects overall collaboration performance and user experience. We found that by combining gaze and gesture cues, our remote collaboration system provided a significantly stronger sense of co-presence for both the local and remote users than the gaze cue alone. The combined cues were also rated significantly higher than the gaze cue alone in terms of ease of conveying spatial actions.
