[POSTER] Usability Analysis of an Off-the-Shelf Hand Posture Estimation Sensor for Freehand Physical Interaction in Egocentric Mixed Reality

This paper explores freehand physical interaction in egocentric Mixed Reality through a usability study of an off-the-shelf hand posture estimation sensor. We report precision, interactivity, and usability metrics from a task-based user study that examines the importance of additional visual cues during interaction. A total of 750 interactions were recorded from 30 participants performing 5 interaction tasks (Move; Rotate: pitch (Y axis) and yaw (Z axis); Uniform scale: enlarge and shrink). Additional visual cues resulted in a shorter average time to interact; however, no consistent statistically significant differences were found between groups in performance and precision. The group with additional visual cues gave the system an average System Usability Scale (SUS) score of 72.33 (SD 16.24), while the other group scored 68.0 (SD 18.68). Overall, additional visual cues led the system to be perceived as more usable, even though the two conditions had limited effect on precision and interactivity metrics.
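The SUS scores reported above follow the standard scoring procedure: each of the ten 1–5 Likert responses is converted to a 0–4 contribution (odd-numbered, positively worded items contribute response − 1; even-numbered, negatively worded items contribute 5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. As a minimal sketch (the helper name `sus_score` is illustrative, not from the paper):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Items are assumed to be in questionnaire order: odd-numbered items
    (indices 0, 2, ... here) are positively worded and contribute
    (response - 1); even-numbered items are negatively worded and
    contribute (5 - response). The sum is scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("Each response must be on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5
```

For example, a participant who strongly agrees with every positive item and strongly disagrees with every negative item (`[5, 1] * 5`) scores 100, while uniform neutral responses (`[3] * 10`) score 50.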
