Pen + touch = new tools

We describe techniques for direct pen+touch input. We observe people's manual behaviors with physical paper and notebooks. These observations serve as the foundation for a prototype Microsoft Surface application, centered on note-taking and scrapbooking of materials. Based on our explorations we advocate a division of labor between pen and touch: the pen writes, touch manipulates, and the combination of pen + touch yields new tools. This division of labor articulates how our system interprets unimodal pen, unimodal touch, and multimodal pen+touch inputs, respectively. For example, the user can hold a photo and drag off with the pen to create and place a copy; hold a photo and cross it in a freeform path with the pen to slice it in two; or hold selected photos and tap one with the pen to staple them all together. Touch thus unifies object selection with mode switching of the pen, while the muscular tension of holding touch serves as the "glue" that phrases together all the inputs into a unitary multimodal gesture. This helps the UI designer to avoid encumbrances such as physical buttons, persistent modes, or widgets that detract from the user's focus on the workspace.
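
To make the interpretation scheme concrete, here is a minimal dispatch sketch, not the authors' implementation: all types, fields, and handler names below are hypothetical illustrations of the pen-writes / touch-manipulates / pen+touch-yields-tools division of labor, where a sustained touch acts as the "glue" that rebinds the pen from inking to a tool.

```typescript
// Hypothetical event types; a real system would carry richer state
// (contact IDs, pressure, the object under the holding finger, etc.).
type PenEvent = { kind: "pen"; x: number; y: number };
type TouchEvent_ = { kind: "touch"; x: number; y: number; held: boolean };

interface Workspace {
  ink(e: PenEvent): void;                         // unimodal pen: the pen writes
  manipulate(e: TouchEvent_): void;               // unimodal touch: move, zoom, select
  tool(pen: PenEvent, touch: TouchEvent_): void;  // multimodal: pen + touch = new tool
}

// Dispatcher: while a touch is held, pen input is phrased into a multimodal
// gesture (copy, slice, staple, ...); otherwise each modality keeps its
// default role. No explicit mode switch widget is needed.
function dispatch(ws: Workspace, pen?: PenEvent, touch?: TouchEvent_): void {
  if (pen && touch?.held) {
    ws.tool(pen, touch);   // e.g. hold photo + drag off with pen = copy
  } else if (pen) {
    ws.ink(pen);           // the pen writes
  } else if (touch) {
    ws.manipulate(touch);  // touch manipulates
  }
}

// Stub workspace that logs which role fired, to show the routing.
const ws: Workspace = {
  ink: (e) => console.log(`ink at (${e.x}, ${e.y})`),
  manipulate: (e) => console.log(`manipulate at (${e.x}, ${e.y})`),
  tool: (p, t) => console.log(`tool: pen (${p.x}, ${p.y}) + held touch (${t.x}, ${t.y})`),
};

dispatch(ws, { kind: "pen", x: 10, y: 20 });                          // writes
dispatch(ws, undefined, { kind: "touch", x: 5, y: 5, held: false })); // manipulates
dispatch(ws, { kind: "pen", x: 10, y: 20 },
             { kind: "touch", x: 5, y: 5, held: true });              // new tool
```

Note how releasing the touch ends the multimodal phrase and the pen falls back to inking, which is the spring-loaded, non-persistent mode behavior the abstract argues for.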
