An exploration of multi-finger interaction on multi-touch surfaces

Recent advances in touch sensing have made it possible to interact with computers in a device-free manner, allowing arguably more natural and intuitive input using multiple hands and fingers. Unfortunately, existing multi-point touch-sensitive devices suffer from sensor limitations that restrict the kinds of manipulations they can detect. Additionally, while many well-studied techniques from the bimanual-interaction literature apply to these emerging multi-point devices, many questions remain about how multiple fingers of a single hand can best be used on touch-sensitive surfaces. This dissertation addresses some of these open issues.

We first develop the Visual Touchpad, a low-cost vision-based input device that detects multiple hands and fingertips over a constrained planar surface. Unlike existing multi-point devices, the Visual Touchpad captures a reliable 2D image of the entire hand, from which more detailed information about the fingers, such as labels, orientation, and hover state, can be extracted. We then design and implement three systems that leverage these capabilities to explore how multiple fingers could be used in real-world interface scenarios.

Next, we propose and experimentally validate a fluid interaction style in which the thumb and index finger of a single hand operate in an asymmetric-dependent manner to control bi-digit widgets: the index finger performs the primary, more frequent 2D tasks, while the thumb performs secondary, less frequent tasks that support the index finger's manipulations. We then investigate the impact of visual feedback on the perception of finger span when using bi-digit widgets to merge command selection and direct manipulation. Results suggest that users can select from up to four discrete commands with the thumb without any visual feedback, which allows us to design a set of more advanced bi-digit widgets that support a smooth transition from novice to expert usage.
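The finding that the thumb can reliably distinguish up to four discrete commands eyes-free suggests a simple mapping from finger span to command. The following is a minimal sketch of that idea, not the dissertation's implementation: it assumes the sensing layer already reports a thumb-index span normalized to [0, 1], and the command names are placeholders chosen for illustration.

```python
def select_command(span, commands=("cut", "copy", "paste", "delete")):
    """Map a normalized thumb-index span (0.0 = fully closed,
    1.0 = fully open) to one of len(commands) discrete commands
    by uniform quantization of the span range."""
    if not 0.0 <= span <= 1.0:
        raise ValueError("span must be normalized to [0, 1]")
    # Uniform bins across [0, 1]; clamp span == 1.0 into the last bin.
    index = min(int(span * len(commands)), len(commands) - 1)
    return commands[index]
```

In a real bi-digit widget the bin boundaries would likely need per-user calibration and hysteresis to avoid flicker near boundaries; uniform bins are used here only to keep the sketch self-contained.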
