Imperceptible depth shifts for touch interaction with stereoscopic objects

While touch technology has proven its usability for 2D interaction and has become a standard input modality for many devices, the challenges of applying it to stereoscopically rendered content have barely been studied. In this paper we exploit properties of human visual perception to let users touch stereoscopically displayed objects even though input is constrained to a 2D surface. To this end, we extend and generalize recent evaluations of the user's ability to discriminate small induced object shifts while reaching out to touch a virtual object, and we propose a practical interaction technique, the attracting shift technique, which is suitable for numerous application scenarios in which shallow-depth interaction is sufficient. In addition, our results indicate that slight object shifts during touch interaction make the virtual scene appear perceptually more stable than a completely static scene; as a consequence, applications actually have to manipulate virtual objects in order to make them appear static to the user.
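
To illustrate the general idea of an attracting shift, the following minimal sketch nudges an object's stereoscopic depth toward the screen plane as the finger approaches the touch surface. All names, distances, and the shift fraction are illustrative assumptions for a shallow-depth scenario, not the parameters or implementation reported in the paper.

```python
def attracting_shift(object_depth_cm: float,
                     finger_distance_cm: float,
                     reach_start_cm: float = 30.0,
                     max_shift_fraction: float = 0.3) -> float:
    """Return an adjusted stereoscopic depth (cm in front of the screen).

    object_depth_cm:    current depth of the virtual object
    finger_distance_cm: height of the finger above the touch surface
    reach_start_cm:     assumed distance at which the reaching motion begins
    max_shift_fraction: largest fraction of the depth that may be removed
                        (assumed to stay below the detection threshold
                        for shallow depths)
    """
    # Progress of the reach: 0.0 at the start, 1.0 when the surface is touched.
    progress = 1.0 - min(max(finger_distance_cm / reach_start_cm, 0.0), 1.0)

    # Gradually move the object toward the screen plane (depth 0) by a bounded
    # fraction, so that each frame-to-frame change remains small.
    shift = object_depth_cm * max_shift_fraction * progress
    return object_depth_cm - shift


# Example: an object floating 5 cm in front of the screen, finger 10 cm away.
if __name__ == "__main__":
    print(attracting_shift(object_depth_cm=5.0, finger_distance_cm=10.0))
```

Applying the shift progressively over the reach, rather than in one step at touch time, is what keeps the per-frame displacement small enough to remain unnoticed.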
