OrthoGaze: Gaze-based three-dimensional object manipulation using orthogonal planes

Abstract: In virtual and augmented reality, gaze-based methods have been explored for decades as effective user interfaces for hands-free interaction. Although several well-known gaze-based methods exist for simple interactions such as selection, no solutions exist for 3D manipulation tasks that require a higher number of degrees of freedom (DoF). In this paper, we introduce OrthoGaze, a novel user interface that allows users to intuitively manipulate the three-dimensional position of a virtual object using only their eye or head gaze. Our approach makes use of three selectable, orthogonal planes, where each plane not only helps guide the user's gaze through an arbitrary virtual space but also allows 2-DoF manipulation of object position. To evaluate our method, we conducted two user studies in virtual reality, an aiming task and a docking task, to characterize sustained gaze aiming and to determine which type of gaze-based control performs best when combined with OrthoGaze. Results showed that eye gaze was more accurate than head gaze for sustained aiming. Additionally, eye- and head-gaze-based control for 3D manipulation achieved 78% and 96%, respectively, of the performance of a hand-held controller. Subjective results also suggest that gaze-based manipulation causes more overall fatigue than controller-based manipulation. Based on the experimental results, we expect OrthoGaze to become an effective method for purely hands-free object manipulation with head-mounted displays.
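To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of how a gaze ray constrained to one of three axis-aligned orthogonal planes could yield a 2-DoF position update: the object follows the intersection of the gaze ray with the currently selected plane, so only two coordinates change while the third stays fixed. The plane names, axis alignment, and function names are illustrative assumptions.

```python
# Illustrative sketch (assumption, not the OrthoGaze source): move an object to
# where the gaze ray hits the selected orthogonal plane through the object.
import numpy as np

# Assumed setup: three axis-aligned planes passing through the object's centre.
PLANE_NORMALS = {
    "XY": np.array([0.0, 0.0, 1.0]),  # motion in x/y, z fixed
    "YZ": np.array([1.0, 0.0, 0.0]),  # motion in y/z, x fixed
    "XZ": np.array([0.0, 1.0, 0.0]),  # motion in x/z, y fixed
}

def gaze_plane_intersection(gaze_origin, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with a plane; return None if the ray is nearly
    parallel to the plane or the hit lies behind the gaze origin."""
    denom = np.dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-6:
        return None
    t = np.dot(plane_point - gaze_origin, plane_normal) / denom
    return gaze_origin + t * gaze_dir if t > 0.0 else None

def manipulate_on_plane(object_pos, selected_plane, gaze_origin, gaze_dir):
    """2-DoF update: place the object at the gaze hit point on the selected
    plane, leaving the coordinate along the plane normal unchanged."""
    direction = np.asarray(gaze_dir, dtype=float)
    direction /= np.linalg.norm(direction)
    hit = gaze_plane_intersection(np.asarray(gaze_origin, dtype=float),
                                  direction, object_pos,
                                  PLANE_NORMALS[selected_plane])
    return object_pos if hit is None else hit

# Example: with the XY plane selected, the object tracks the gaze in x and y
# while its z coordinate (depth) remains 2.0.
new_pos = manipulate_on_plane(np.array([0.0, 1.0, 2.0]), "XY",
                              gaze_origin=np.array([0.0, 1.6, 0.0]),
                              gaze_dir=np.array([0.1, -0.05, 1.0]))
```

Under these assumptions, full 3-DoF positioning would be obtained by switching between the three planes, with each plane contributing its own 2-DoF adjustment.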
