3D Mirrored Object Selection for Occluded Objects in Virtual Environments

There is an increasing demand for the manipulation of virtual objects in 3D virtual reality (VR) space, which begins with the user selecting a desired object. Existing selection methods aim to provide an intuitive and natural experience that minimizes user fatigue. Although typical ray-casting methods are effective when objects in the virtual space are sparsely or evenly placed, selection becomes challenging when objects are small or distant, and particularly when they overlap. In this study, we propose a method that addresses these challenges through a more effective selection process. Based on 3D mirroring, the proposed method provides a new interaction metaphor that allows occluded virtual objects to be selected more efficiently. We conduct systematic experiments with 21 participants across varying levels of target visibility, determined by object size, spatial density, and degree of occlusion. The results indicate that, in occluded 3D environments, the proposed 3D mirrored selection process outperforms traditional gaze-supported selection and yields higher user satisfaction.
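To make the occlusion problem concrete, the following is a minimal sketch of conventional ray-cast selection against sphere proxies, plus one possible way to realize a mirrored-ray metaphor. All names and the planar-mirror formulation are illustrative assumptions for this sketch, not the paper's actual implementation:

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    name: str
    center: tuple   # (x, y, z)
    radius: float

def ray_sphere_hit(origin, direction, sphere):
    """Distance t to the nearest intersection along a unit-direction ray,
    or None if the ray misses the sphere."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = sphere.center
    lx, ly, lz = ox - cx, oy - cy, oz - cz   # center -> origin
    b = 2 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - sphere.radius ** 2
    disc = b * b - 4 * c                     # a == 1 for a unit direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def raycast_select(origin, direction, spheres):
    """Classic ray-casting: the nearest hit wins, so any object behind
    the first hit is occluded and cannot be selected directly."""
    hits = [(t, s) for s in spheres
            if (t := ray_sphere_hit(origin, direction, s)) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None

def mirrored_select(origin, direction, spheres, mirror_z):
    """Hypothetical mirrored cast: reflect the ray across the plane
    z = mirror_z. Viewed from the mirrored side the depth order reverses,
    so a formerly occluded object becomes the nearest hit."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    m_origin = (ox, oy, 2 * mirror_z - oz)
    m_dir = (dx, dy, -dz)
    return raycast_select(m_origin, m_dir, spheres)
```

For two spheres stacked along the viewing axis, `raycast_select` can only ever return the front one, while `mirrored_select` reaches the one hidden behind it, which is the core intuition behind a mirror-based selection metaphor.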
