Identifying the Usability Factors of Mid-Air Hand Gestures for 3D Virtual Model Manipulation

Although manipulating 3D virtual models with mid-air hand gestures offers the benefits of natural interaction and freedom from the sanitation problems of touch surfaces, many factors can influence the usability of such an interaction paradigm. In this research, the authors conducted experiments to study vision-based mid-air hand gestures for scaling, translating, and rotating a 3D virtual car displayed on a large screen. An Intel RealSense 3D Camera was employed for hand gesture recognition. A two-hand gesture, grabbing and then moving the hands apart or together, was used to enlarge or shrink the 3D virtual car. A one-hand gesture, grabbing and then moving, was used to translate a car component. A two-hand gesture, grabbing and then moving the hands relative to each other along the circumference of a horizontal circle, was used to rotate the car. Seventeen graduate students participated in the experiments and provided evaluations of and comments on gesture usability. The results indicated that the width and depth of the detection range were the key usability factors for two-hand gestures with linear motions. For dynamic gestures involving quick transitions from open to closed hand poses, ensuring robust gesture recognition was critical. Furthermore, even for a gesture with ergonomic postures, an inappropriate control-response ratio could cause fatigue, because achieving precise control in 3D model manipulation tasks then requires repeated exertions of the hand gesture.
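
To make the gesture-to-manipulation mappings above concrete, the following minimal Python sketch (not the authors' implementation) shows how per-frame 3D hand positions from any hand tracker, such as one driven by the Intel RealSense camera, could be turned into scale, translation, and rotation parameters. The coordinate convention (y up, x-z horizontal), the CR_RATIO gain, and all function names are assumptions for illustration.

    import math

    # Assumed control-response gain (model units per meter of hand motion);
    # this value is illustrative, not taken from the study.
    CR_RATIO = 1.0

    def hand_distance(left, right):
        """Euclidean distance between the two grabbing hands."""
        return math.dist(left, right)

    def scale_factor(left0, right0, left, right):
        """Two-hand scaling: hands grab, then move apart (enlarge) or
        together (shrink). Scale is the ratio of the current to the
        initial inter-hand distance."""
        return hand_distance(left, right) / hand_distance(left0, right0)

    def translation(hand0, hand):
        """One-hand translation: a grabbed component follows the hand's
        displacement, amplified or attenuated by the C-R ratio."""
        return tuple(CR_RATIO * (c - c0) for c0, c in zip(hand0, hand))

    def rotation_angle(left0, right0, left, right):
        """Two-hand rotation: both hands grab and move along a horizontal
        circle. The yaw angle is the change in heading of the
        left-to-right hand vector projected onto the x-z plane."""
        def heading(l, r):
            return math.atan2(r[2] - l[2], r[0] - l[0])
        return heading(left, right) - heading(left0, right0)

    if __name__ == "__main__":
        # Hands start 0.3 m apart and end 0.6 m apart: the car doubles in size.
        l0, r0 = (-0.15, 0.0, 0.5), (0.15, 0.0, 0.5)
        l1, r1 = (-0.30, 0.0, 0.5), (0.30, 0.0, 0.5)
        print(scale_factor(l0, r0, l1, r1))             # 2.0
        print(translation((0, 0, 0.5), (0.1, 0, 0.5)))  # (0.1, 0.0, 0.0)

Tuning the assumed CR_RATIO illustrates the trade-off the study reports: a low ratio demands repeated clutching to cover large movements, while a high ratio sacrifices the precision needed for fine placement.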
