Augmented reality-based remote coaching for fast-paced physical tasks

One popular application of augmented reality (AR) is real-time guidance and training, in which the AR user receives useful information from a remote expert. For relatively fast-paced tasks, presenting such guidance in a way that the recipient can recognize and understand it immediately is an especially challenging problem. In this paper, we present an AR-based tele-coaching system for the game of tennis, called the AR coach, and explore interface design guidelines through a user study. We evaluated the player’s instruction-understanding performance when the coaching instruction was presented in four different modalities: (1) Visual—visual only, (2) Sound—aural only/mono, (3) 3D Sound—aural only/3D, and (4) Multimodal—both visual and aural/mono. Results from the experiment suggested that, among the four, the visual-only augmentation was the most effective and least distracting for the given pace of information transfer (e.g., a new instruction every 3 s or less). We attribute this result to the visual modality’s capacity to encode and present a large amount of information at once, and to humans’ limited capability to process and fuse multimodal information at a relatively fast rate.
