A User Study on MR Remote Collaboration Using Live 360 Video

Sharing and watching live 360° panorama video is now possible on modern social networking platforms, yet the communication is often a passive, one-directional experience. This research investigates how to improve live 360° panorama-based remote collaboration by adding Mixed Reality (MR) cues. SharedSphere is a wearable MR remote collaboration system that enriches collaboration over a live-captured immersive panorama through MR visualisation of non-verbal communication cues (e.g., view awareness and gesture cues). We describe the design and implementation of the prototype system and report on a user study investigating how MR live panorama sharing affects the user's collaborative experience. The results showed that the view independence provided by sharing a live panorama enhances co-presence in collaboration, and that the MR cues help users understand each other. Based on the study results, we discuss design implications and future research directions.
