Evaluating an augmented remote assistance platform to support industrial applications

Remote assistance provides a communication bridge between users in different locations. However, designing such systems for the Internet of Things (IoT) is challenging, as interacting through digital representations differs from sharing a physical space. In this paper, we present a Remote Assistance Platform (RAP) designed to facilitate task guidance between an instructor and one or more remote operators. The platform supports visual communication through annotation tools that augment information over a live video stream. Two user studies were performed to evaluate co-located and remote interaction. In the first study, dyads worked through paper-based instructions while situated in the same location. In the second study, different dyads performed the same tasks remotely, assisted by either a smartphone or a smart glasses display. Overall, we found significant differences in communication behaviour depending on the type of collaborative environment and the information modality used. A short review of these results is presented.
