In this study we present a Mixed Reality-based mobile remote collaboration system that enables an expert to provide real-time assistance over a physical distance. Using Google ARCore's position tracking, we integrate the keyframes captured by an external depth sensor attached to the mobile phone into a single 3D point-cloud data set that represents the local physical environment in the VR world. This captured local scene is wirelessly streamed to the remote side, where the expert views it through a mobile VR headset (HTC VIVE Focus). The remote expert can thus be immersed in the VR scene and provide guidance as if sharing the same work environment with the local worker. In addition, the remote guidance is streamed back to the local side as AR cues overlaid on the local video see-through display. By simulating the face-to-face co-working experience with Mixed Reality, our proposed mobile remote collaboration system allows a pair of participants, one remote expert guiding one local worker, to perform physical tasks more naturally and efficiently in a large-scale workspace from a distance.
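The core fusion step described above, combining depth keyframes into one point cloud using tracked camera poses, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `fuse_keyframes` is hypothetical, and it assumes each keyframe provides camera-frame points plus a 4x4 camera-to-world pose matrix of the kind a tracker such as ARCore reports.

```python
import numpy as np

def fuse_keyframes(keyframes):
    """Merge depth keyframes into a single world-frame point cloud.

    keyframes: list of (points, pose) pairs, where `points` is an
    (N, 3) array in the camera frame and `pose` is a 4x4
    camera-to-world transform (e.g. from the tracked device pose).
    Returns an (M, 3) array of all points in the world frame.
    """
    clouds = []
    for points, pose in keyframes:
        # Lift to homogeneous coordinates and apply the rigid transform:
        # p_world = R @ p_cam + t, expressed as a single matrix product.
        homo = np.hstack([points, np.ones((points.shape[0], 1))])
        clouds.append((homo @ pose.T)[:, :3])
    return np.vstack(clouds)
```

In a real pipeline each keyframe's cloud would also be downsampled (e.g. voxel-grid filtered) before streaming, to keep the wireless bandwidth manageable.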