A gesture- and head-based multimodal interaction platform for MR remote collaboration

In this paper, we present a projector-based mixed reality (MR) remote collaboration system that enables remote users to work together on a physical task using gesture and head pointing (GHP). Using this platform, we studied the effects of GHP in a typical manufacturing use case. Our system supports natural and intuitive multimodal interaction based on GHP, projecting the remote user's gestures and head-pointing cues into the local environment to enhance remote collaboration. We compared our prototype with an augmented reality (AR) condition (ANNOTATION), currently the most popular method for AR/MR remote collaboration, and found a significant performance difference between the two conditions. The GHP system significantly improved the collaborative experience (e.g., awareness of the partner's attention), empathy (e.g., co-presence), and remote interaction. Finally, we discuss the implications of this research and directions for future work.
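To make the head-pointing idea concrete, the sketch below shows one common way such a cue can be computed: cast a ray along the user's head-forward direction and intersect it with the workspace plane, yielding the point a projector could highlight. This is a minimal illustration under assumed conventions (the function name and the planar-workspace assumption are ours, not from the paper):

```python
import numpy as np

def head_pointer_on_plane(head_pos, head_dir, plane_point, plane_normal):
    """Intersect a head-forward ray with the workspace plane.

    head_pos, head_dir: 3D head position and forward direction.
    plane_point, plane_normal: any point on the plane and its normal.
    Returns the 3D intersection point, or None if the user is looking
    away from (or parallel to) the plane.
    """
    head_dir = np.asarray(head_dir, dtype=float)
    head_dir = head_dir / np.linalg.norm(head_dir)
    denom = np.dot(plane_normal, head_dir)
    if abs(denom) < 1e-6:          # ray is parallel to the plane
        return None
    t = np.dot(plane_normal, np.asarray(plane_point) - head_pos) / denom
    if t < 0:                      # plane is behind the user
        return None
    return np.asarray(head_pos) + t * head_dir
```

For example, a user whose head is 1 m above a horizontal table and who looks straight down would produce a pointer directly beneath the head; the resulting 3D point would then be warped into projector coordinates for display in the local workspace.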
