AnyOrbit: orbital navigation in virtual environments with eye-tracking

Gaze-based interaction promises fast, intuitive, and effective control of virtual and augmented environments, yet usable 3D navigation and observation techniques remain scarce. In this work: 1) we introduce AnyOrbit, an orbital navigation technique that provides an intuitive, hands-free method of observation in virtual environments, using eye-tracking to control the orbital center of movement; 2) we demonstrate the versatility of the technique with several control schemes and use cases in virtual/augmented-reality head-mounted-display and desktop setups, including the observation of 3D astronomical data and spectator sports.
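The core idea, orbiting the camera about a pivot placed at the gazed-at point rather than a fixed scene origin, can be illustrated with a minimal sketch. This is not the authors' implementation (AnyOrbit's full technique supports fluid 6DOF orbital motion); it is a generic spherical-orbit step, with the orbit center assumed to come from a gaze-ray hit point supplied by an eye tracker:

```python
import math

def orbit_camera(cam_pos, center, d_azimuth, d_elevation):
    """One orbital step: move the camera about `center` (the gazed-at
    point) by small azimuth/elevation increments, preserving the orbit
    radius. Points are (x, y, z) tuples; Y is world-up. Illustrative
    only -- a hypothetical helper, not the AnyOrbit algorithm."""
    # Offset from the orbit center to the camera.
    ox = cam_pos[0] - center[0]
    oy = cam_pos[1] - center[1]
    oz = cam_pos[2] - center[2]
    r = math.sqrt(ox * ox + oy * oy + oz * oz)
    # Express the offset in spherical coordinates.
    azimuth = math.atan2(oz, ox) + d_azimuth
    elevation = math.asin(oy / r) + d_elevation
    # Clamp elevation so the camera never flips over the poles.
    elevation = max(-math.pi / 2 + 1e-3,
                    min(math.pi / 2 - 1e-3, elevation))
    # Back to Cartesian: the camera stays on a sphere of radius r
    # around the gaze target, so the target remains centered in view.
    return (center[0] + r * math.cos(elevation) * math.cos(azimuth),
            center[1] + r * math.sin(elevation),
            center[2] + r * math.cos(elevation) * math.sin(azimuth))
```

Because the pivot is re-acquired from gaze each time the user fixates a new object, the same small step function yields orbits "about anything", which is the property the technique's name refers to.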
