Design of unimanual multi-finger pie menu interaction

Context menus, most commonly the right-click menu, are a traditional method of interaction when using a keyboard and mouse. Context menus make a subset of an application's commands quickly available to the user. However, on tabletop touchscreen computers, context menus have all but disappeared. In this paper, we investigate how to design context menus for efficient unimanual multi-touch use. We examine the limitations of the arm, wrist, and fingers and how they relate to human performance in multi-target selection tasks on multi-touch surfaces. We show that selecting targets with multiple fingers simultaneously improves target-selection performance compared to traditional single-finger selection, but also increases errors. Informed by these results, we present our own context menu design for horizontal tabletop surfaces.
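To make the pie-menu interaction concrete, the core geometric step is mapping a touch point to the angular sector it falls in. The sketch below is illustrative only, not the paper's implementation: the function name, coordinate convention (screen y grows downward), and default slice count are assumptions.

```python
import math

def pie_slice(touch_x, touch_y, center_x, center_y, n_slices=8):
    """Map a touch point to a pie-menu slice index in [0, n_slices).

    Slice 0 is centered on the "up" direction; indices increase
    clockwise, following screen coordinates (y grows downward).
    """
    dx = touch_x - center_x
    dy = touch_y - center_y
    # Angle measured clockwise from straight up, normalized to [0, 2*pi).
    angle = math.atan2(dx, -dy) % (2 * math.pi)
    width = 2 * math.pi / n_slices
    # Shift by half a slice so slice 0 straddles the up direction.
    return int(((angle + width / 2) % (2 * math.pi)) // width)
```

With eight slices, a touch directly above the menu center selects slice 0, a touch to the right selects slice 2, and so on around the circle; a multi-finger variant would simply evaluate this mapping once per contact point.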
