A Comparative Analysis of 3D User Interaction: How to Move Virtual Objects in Mixed Reality

Using one’s hands can be a natural and intuitive way to interact with 3D objects in a mixed reality environment. This study compares three hand-interaction techniques (gaze-and-pinch, touch-and-grab, and worlds-in-miniature) for selecting and moving virtual furniture in a 3D scene. Overall, a comparative analysis reveals that worlds-in-miniature provided better usability and task performance than the other studied techniques. We also conducted in-depth interviews and analyzed participants’ hand gestures to identify desirable attributes for 3D hand-interaction design. Findings from the interviews suggest that, with respect to enjoyment and discoverability, users prefer directly manipulating virtual furniture over interacting with objects remotely or through indirect input such as gaze. Another insight from this study is the critical role of a virtual object’s visual appearance in designing natural hand interaction. Gesture analysis reveals that the shape of a piece of furniture, as well as its perceived features such as weight, largely determined participants’ instinctive forms of hand interaction (e.g., lift, grab, push). Based on these findings, we present design suggestions to help 3D interaction designers develop natural and intuitive hand interactions for mixed reality.
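To make the worlds-in-miniature idea concrete, the sketch below shows the core coordinate mapping such a technique relies on: a pose set on a proxy object inside a handheld miniature is mirrored onto its full-scale counterpart in the room, and vice versa. This is an illustrative reconstruction under stated assumptions, not the study’s implementation: the `WorldsInMiniature` class, the uniform `scale` parameter, and the use of plain 3-vectors with no rotation between the two frames are all simplifications made for the example.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]


@dataclass
class WorldsInMiniature:
    """Maps positions between a miniature model and the full-scale room.

    Hypothetical sketch: assumes the miniature is a uniformly scaled,
    axis-aligned copy of the room (no rotation between the two frames).
    """
    room_origin: Vec3   # world-space anchor of the full-scale room
    wim_origin: Vec3    # world-space anchor of the miniature model
    scale: float        # miniature size relative to the room, e.g. 0.05

    def to_world(self, p_mini: Vec3) -> Vec3:
        """Proxy position inside the miniature -> full-scale position."""
        return tuple(
            r + (m - w) / self.scale
            for r, m, w in zip(self.room_origin, p_mini, self.wim_origin)
        )

    def to_miniature(self, p_world: Vec3) -> Vec3:
        """Full-scale position -> where its proxy sits in the miniature."""
        return tuple(
            w + (p - r) * self.scale
            for w, p, r in zip(self.wim_origin, p_world, self.room_origin)
        )


if __name__ == "__main__":
    wim = WorldsInMiniature(room_origin=(0.0, 0.0, 0.0),
                            wim_origin=(0.2, 1.0, -0.4),  # tabletop miniature
                            scale=0.05)                   # 1:20 model
    # User pinches the miniature sofa and drags it 10 cm to the right:
    sofa_proxy = (0.30, 1.0, -0.4)
    print(wim.to_world(sofa_proxy))  # -> (2.0, 0.0, 0.0): sofa moves 2 m
```

One design consequence this mapping makes visible is why the miniature performed well for furniture placement: a small, comfortable hand movement on the proxy translates into a large, precise relocation at room scale, without the user having to walk to or reach for the full-size object.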
