From Direct Manipulation to Gestures

Optimizing the bandwidth of the communication channel between users and the system is fundamental to designing efficient interactive systems. Apart from speech-based interfaces that rely on users' natural language, this entails designing an efficient language that users can adopt and that the system can understand. My research has focused on studying and optimizing two such languages: interfaces that let users trigger actions through the direct manipulation of on-screen objects, and interactive systems that let users invoke commands by performing specific movements. Direct manipulation encodes most information in the graphical representation, relying chiefly on users' ability to recognize visual elements; gesture-based interaction instead interprets the shape and dynamics of users' movements, relying chiefly on their ability to recall specific movements. This manuscript presents my main research projects on these two types of language and discusses how to increase the efficiency of interactive systems that use them. With direct manipulation, achieving high expressive power and good usability depends on the interface's ability to accommodate large graphical scenes while enabling easy selection and manipulation of the objects in those scenes. With gestures, it depends on the number of gestures in the system's vocabulary as well as on their simplicity, since gestures must remain easy to learn and execute. I conclude with directions for future work on interaction with tangible objects.
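To make the recall-based side of this distinction concrete, the sketch below shows the skeleton of a simple template-matching stroke recognizer: a candidate stroke is resampled to a fixed number of points, normalized for position and scale, and matched against stored templates by average point-to-point distance. This is a minimal illustration of the general family of such recognizers, not code from any system described in this manuscript; all function names are invented for the example.

```python
import math

def resample(points, n=32):
    """Resample a stroke to n points evenly spaced along its path length."""
    dists = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    total = sum(dists)
    if total == 0:
        return [points[0]] * n
    step = total / (n - 1)
    pts = list(points)
    new_pts = [pts[0]]
    acc = 0.0
    i = 0
    while len(new_pts) < n and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if d > 0 and acc + d >= step:
            # Interpolate a new point at exactly one `step` of arc length.
            t = (step - acc) / d
            q = (pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                 pts[i][1] + t * (pts[i + 1][1] - pts[i][1]))
            new_pts.append(q)
            pts.insert(i + 1, q)   # continue walking from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(new_pts) < n:        # guard against floating-point shortfall
        new_pts.append(pts[-1])
    return new_pts

def normalize(points):
    """Translate the stroke to its centroid and scale it to a unit box."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    w = max(p[0] for p in pts) - min(p[0] for p in pts)
    h = max(p[1] for p in pts) - min(p[1] for p in pts)
    s = max(w, h) or 1.0
    return [(x / s, y / s) for x, y in pts]

def stroke_distance(a, b):
    """Average pairwise distance between two equal-length point lists."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates):
    """Return the name of the template closest to the candidate stroke."""
    s = normalize(resample(stroke))
    return min(templates,
               key=lambda name: stroke_distance(
                   s, normalize(resample(templates[name]))))
```

Because matching happens in a normalized space, the user's stroke can be drawn anywhere on the screen and at any size; only its shape matters, which is precisely why such interfaces trade recognition of visual elements for recall of movements.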
