Multi-touch 3D positioning with the pantograph technique

One advantage of touch interaction is the sense of direct manipulation; there is perhaps no more intuitive interface than simply reaching out and touching virtual entities. However, direct manipulation is generally limited to objects located on the 2D display surface. For 3D spaces extending behind or in front of a touchscreen, the direct manipulation metaphor quickly breaks down. In these cases, gestures are needed to convert 2D finger positions into 3D cursor positions. This paper presents the pantograph technique, a simple two-finger interaction method for positioning a 3D cursor within monoscopic and stereoscopic applications. The pantograph's pseudomechanical linkage between fingers and cursor provides helpful depth cues and maintains the sense of direct manipulation. Extensions to the technique, which integrate selection and other advanced actions, are explored within the context of real-world visual analysis applications. A series of human-factors experiments showed that, while the pantograph technique outperformed similar multi-touch 3D positioning techniques, multi-touch input remained inferior to traditional non-touch interfaces for sustained 3D positioning tasks.
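To make the two-finger-to-3D-cursor idea concrete, the sketch below shows one plausible mapping of the kind the abstract describes: the midpoint of the two touches anchors the cursor in the screen plane, and the finger separation extends it along the view direction. This is a minimal illustration only; the function name, parameters, and the specific separation-to-depth mapping are assumptions and are not taken from the paper's actual pantograph linkage.

```python
import numpy as np

def pantograph_cursor(touch_a, touch_b, min_sep=0.02, max_sep=0.20, max_depth=0.5):
    """Map two 2D touch points (screen coordinates in metres) to a 3D cursor position.

    Hypothetical mapping: the cursor sits at the midpoint of the two touches in
    the screen plane and is pushed behind the display in proportion to the
    finger separation (wider spread -> deeper cursor). The paper's actual
    pantograph linkage may use a different geometric relationship.
    """
    a = np.asarray(touch_a, dtype=float)
    b = np.asarray(touch_b, dtype=float)
    midpoint = (a + b) / 2.0                       # screen-plane anchor (x, y)
    separation = np.linalg.norm(a - b)             # finger spread controls depth
    t = np.clip((separation - min_sep) / (max_sep - min_sep), 0.0, 1.0)
    depth = t * max_depth                          # metres behind the display surface
    return np.array([midpoint[0], midpoint[1], -depth])

# Example: two fingers 10 cm apart, centred near the middle of the screen
print(pantograph_cursor((0.10, 0.15), (0.20, 0.15)))
```

Under this assumed mapping, moving both fingers together pans the cursor in the screen plane, while spreading or pinching them adjusts its depth, which is consistent with the pseudomechanical "linkage" metaphor described above.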
