Loki: Facilitating Remote Instruction of Physical Tasks Using Bi-Directional Mixed-Reality Telepresence

Remote instruction and guidance of physical tasks holds promise across a wide variety of domains. While it has been the subject of many research projects, current approaches often limit either the communication bandwidth between the expert and the learner (lacking context or spatial information) or their interactivity (unidirectional, asynchronous). Mixed-reality systems built for this purpose typically impose rigid configurations on the expert and the learner. We explore the design space of bi-directional mixed-reality telepresence systems for teaching physical tasks and present Loki, a novel system that spans the dimensions of this space. Loki combines video, audio, and spatial capture with mixed-reality presentation methods to let users explore and annotate both the local and remote environments, and to record and review their own performance as well as their peer's. Loki's system design also enables easy transitions between different configurations within the explored design space. We validate its utility through a varied set of scenarios and a qualitative user study.
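To make the design space concrete, below is a minimal, hypothetical sketch (not taken from the paper) of how the configurations described above might be modelled: each side of the bi-directional link picks a capture modality (2D video vs. 3D spatial capture, plus audio), a mixed-reality presentation mode, and whether it is in a live session or reviewing a recording; a "transition" is then just a change of one of these choices. All dimension labels and names here are assumptions for illustration and may differ from the paper's own taxonomy.

```python
from dataclasses import dataclass, replace
from enum import Enum, auto

class CaptureMode(Enum):
    VIDEO = auto()      # 2D video + audio capture
    SPATIAL = auto()    # 3D spatial (depth) capture + audio

class PresentationMode(Enum):
    AUGMENTED = auto()  # remote content overlaid on the user's own environment
    IMMERSIVE = auto()  # fully virtual view of the remote environment

class Timing(Enum):
    LIVE = auto()       # synchronous exploration and annotation
    RECORDED = auto()   # asynchronous review of a captured performance

@dataclass
class UserConfig:
    """One side (expert or learner) of the bi-directional link."""
    capture: CaptureMode
    presentation: PresentationMode
    timing: Timing

@dataclass
class SessionConfig:
    expert: UserConfig
    learner: UserConfig

def transition(session: SessionConfig, role: str, **changes) -> SessionConfig:
    """Return a new session with one side's configuration updated,
    modelling an 'easy transition' within the design space."""
    side = getattr(session, role)
    return replace(session, **{role: replace(side, **changes)})

# Example: the learner switches from live video guidance to
# reviewing a spatially captured recording of their performance.
session = SessionConfig(
    expert=UserConfig(CaptureMode.SPATIAL, PresentationMode.AUGMENTED, Timing.LIVE),
    learner=UserConfig(CaptureMode.VIDEO, PresentationMode.AUGMENTED, Timing.LIVE),
)
session = transition(session, "learner",
                     capture=CaptureMode.SPATIAL, timing=Timing.RECORDED)
```

Under this reading, a configuration change is a swap of one or two fields rather than a new system setup, which is one way to interpret the abstract's claim that Loki enables easy transitions between configurations.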
