A quality framework for user interaction in virtual environments using wearable devices

Gesture recognition is considered one of the most effective input methods for interacting with virtual environments (VEs). However, the skeleton-tracking techniques widely used for gesture recognition exhibit accuracy problems with micro-gestures that can diminish users' enjoyment. In this paper, we propose a multimodal interaction technique and evaluate it using a wearable head-mounted tracker designed as a measurement instrument. We also present a theoretical framework that addresses the weaknesses of micro-gesture recognition in a CAVE (Cave Automatic Virtual Environment). The effectiveness of the proposed method and its impact on users' enjoyment were evaluated with a 3D gesture-based interface. The results show that the designed measurement and input method improves system accuracy and, in turn, users' enjoyment during navigation within a 3D CAVE.

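For illustration only, the sketch below shows one way a multimodal fusion layer might combine head orientation from a wearable head-mounted tracker with the output of a skeleton-based micro-gesture recognizer; it is not the authors' implementation, and the class names, thresholds, and helper functions are hypothetical assumptions.

```python
import math
from dataclasses import dataclass


@dataclass
class HeadPose:
    """Head orientation reported by a wearable tracker, in radians."""
    yaw: float
    pitch: float
    roll: float


def forward_vector(pose: HeadPose) -> tuple[float, float, float]:
    """Convert yaw/pitch Euler angles into a unit gaze-direction vector."""
    x = math.cos(pose.pitch) * math.sin(pose.yaw)
    y = math.sin(pose.pitch)
    z = math.cos(pose.pitch) * math.cos(pose.yaw)
    return (x, y, z)


def fuse_navigation_command(pose: HeadPose,
                            gesture_label: str,
                            gesture_confidence: float,
                            accept_threshold: float = 0.6):
    """Cross-check a skeleton-based micro-gesture against head orientation.

    A low-confidence micro-gesture is accepted only when the head tracker
    indicates the user is roughly facing the interaction target, which is
    one plausible way head data could compensate for skeleton-tracking noise.
    """
    facing_forward = abs(pose.yaw) < math.radians(30)
    if gesture_confidence >= accept_threshold or (
            facing_forward and gesture_confidence >= 0.4):
        return {"action": gesture_label, "direction": forward_vector(pose)}
    return None  # reject the detection as noise


if __name__ == "__main__":
    pose = HeadPose(yaw=math.radians(10), pitch=math.radians(-5), roll=0.0)
    # A borderline micro-gesture detection that the head pose helps confirm.
    print(fuse_navigation_command(pose, "swipe_left", gesture_confidence=0.45))
```

The design choice illustrated here is simply that a second modality (head pose) gates uncertain detections from the first (skeleton tracking); the actual fusion rule, thresholds, and gesture vocabulary would depend on the system described in the paper.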