EGuide: Investigating Different Visual Appearances and Guidance Techniques for Egocentric Guidance Visualizations

Mid-air arm movements are important for many activities. However, common resources for their self-directed practice require practitioners to divide their attention between an external source (e.g., a video screen) and their own movement. Past research found benefits of egocentric guidance visualizations over such common resources, but there is limited evidence on how these visualizations should look and behave. EGuide supports the investigation of different egocentric visualizations for guiding mid-air arm movements. We compared two visual appearances for egocentric guidance visualizations that differ in their shape (look), and three guidance techniques that differ in how they guide the user (behavior). For visualizations with a continuously moving guidance technique, our results suggest higher movement accuracy for a realistic shape than for an abstract one. For user experience and preference, our results suggest that visualizations with an abstract shape and a guidance technique that visualizes important postures should not pause at those postures.
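
To make the behavioral distinction between the guidance techniques concrete, the sketch below illustrates how a guidance cue could replay a prerecorded arm trajectory either continuously or with holds at important postures. This is a minimal illustration under our own assumptions, not the authors' implementation; the names (`Keyframe`, `play_guidance`, `pause_at_important`) are hypothetical and not taken from EGuide.

```python
from dataclasses import dataclass

# Hypothetical sketch: replaying a recorded guide trajectory.
# Not EGuide's actual code; names and parameters are illustrative.

@dataclass
class Keyframe:
    t: float                    # time offset in seconds
    wrist: tuple                # (x, y, z) wrist position of the guide
    important: bool = False     # marks an "important posture"

def lerp(a, b, u):
    """Linearly interpolate between two 3D points."""
    return tuple(a[i] + (b[i] - a[i]) * u for i in range(3))

def play_guidance(frames, pause_at_important=False, hold_s=1.0, fps=60):
    """Yield one cue position per rendered frame.

    pause_at_important=False approximates a continuously moving
    guidance technique; True approximates a technique that holds the
    cue at important postures for `hold_s` seconds before moving on.
    """
    for prev, nxt in zip(frames, frames[1:]):
        steps = max(1, round((nxt.t - prev.t) * fps))
        for s in range(steps):
            yield lerp(prev.wrist, nxt.wrist, s / steps)
        if pause_at_important and nxt.important:
            for _ in range(round(hold_s * fps)):
                yield nxt.wrist  # cue stands still at the key posture
    yield frames[-1].wrist       # finish exactly on the final posture

# Example: a two-segment trajectory with one important posture.
frames = [Keyframe(0.0, (0.0, 0.0, 0.0)),
          Keyframe(1.0, (0.0, 1.0, 0.0), important=True),
          Keyframe(2.0, (1.0, 1.0, 0.0))]
for pos in play_guidance(frames, pause_at_important=True):
    pass  # a renderer would draw the guide (e.g., an arm model) at `pos`
```

Driving the pause from per-keyframe metadata keeps the continuous and pausing variants in a single code path, which is convenient when comparing guidance behaviors against each other in a study.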
