Pointable: an in-air pointing technique to manipulate out-of-reach targets on tabletops

Selecting and moving digital content on interactive tabletops often requires reaching parts of the workspace beyond arm's length. We present Pointable, an in-air, bimanual, perspective-based interaction technique that augments touch input on a tabletop for manipulating distant content. With Pointable, the dominant hand points to select remote targets, while the non-dominant hand scales and rotates them with a dynamic control-display (C/D) gain. We conducted three experiments. The first showed that pointing at a distance with Pointable achieves a Fitts' law throughput comparable to that of a mouse. In the second, Pointable matched the performance of multi-touch input in a resize, rotate, and drag task. In the third, when given the choice, over 75% of participants preferred Pointable over multi-touch for target manipulation. Overall, Pointable lets users manipulate out-of-reach targets without loss of performance, while minimizing the need to lean, stand up, or involve collocated collaborators.
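
To make the mechanism and the throughput measure above concrete, here is a minimal, hypothetical sketch rather than the authors' implementation. It assumes that perspective-based pointing is realized by casting a ray from the user's eye through the fingertip and intersecting it with the table surface, and it computes Fitts' law throughput with the standard effective-width formulation (effective width = 4.133 × SD of selection endpoints). The function names, coordinate conventions, and the numbers in the usage example are all illustrative.

```python
# Illustrative sketch only (not the Pointable implementation):
# 1) perspective pointing, assumed here to be an eye-through-fingertip ray
#    intersected with the table plane, and
# 2) Fitts' law throughput via the standard effective-width method.

import math
from statistics import stdev
from typing import Sequence, Tuple

Vec3 = Tuple[float, float, float]


def perspective_cursor(eye: Vec3, fingertip: Vec3, table_z: float = 0.0) -> Tuple[float, float]:
    """Intersect the eye->fingertip ray with the horizontal table plane z = table_z.

    Returns the (x, y) point on the table the user is pointing at.
    """
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dx, dy, dz = fx - ex, fy - ey, fz - ez            # ray direction
    if abs(dz) < 1e-9:
        raise ValueError("Ray is parallel to the table plane")
    t = (table_z - ez) / dz                           # parameter along the ray
    if t <= 0:
        raise ValueError("Ray does not reach the table plane")
    return ex + t * dx, ey + t * dy


def fitts_throughput(amplitude: float,
                     endpoints: Sequence[float],
                     movement_times: Sequence[float]) -> float:
    """Throughput (bits/s) for one condition, using effective width.

    amplitude      : nominal target distance A (same units as endpoints)
    endpoints      : selection coordinates along the movement axis
    movement_times : movement times in seconds, one per trial
    """
    w_e = 4.133 * stdev(endpoints)                    # effective width
    id_e = math.log2(amplitude / w_e + 1)             # effective index of difficulty
    mt = sum(movement_times) / len(movement_times)    # mean movement time
    return id_e / mt


if __name__ == "__main__":
    # Eye 40 cm above the table edge, fingertip held out toward the far targets.
    print(perspective_cursor(eye=(0.0, -0.2, 0.4), fingertip=(0.05, 0.1, 0.3)))
    # Toy throughput example: 60 cm movements, ~1 cm endpoint spread, ~1.2 s trials.
    print(fitts_throughput(60.0,
                           [59.2, 60.5, 60.1, 59.8, 60.9, 59.5],
                           [1.21, 1.18, 1.25, 1.19, 1.22, 1.20]))
```

The dynamic C/D gain that Pointable applies while scaling and rotating targets is deliberately not modeled here, since the abstract does not specify the mapping function.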
