Pen and touch gestural environment for document editing on interactive tabletops

Combined pen and touch input is an interaction paradigm attracting increasing interest both in the research community and, more recently, in industry. In this paper, we illustrate how pen and touch interaction techniques can be leveraged for the editing and authoring of presentational documents on digital tabletops. Our system exploits the rich interactional vocabulary afforded by the simultaneous availability of the two modalities to provide gesture-driven document editing functionality as an expert alternative to widgets. For our bimanual gestures, we use non-dominant-hand postures to set pen modes in which the dominant hand articulates a variety of transactions. We draw an analogy between such modifier postures and modifier keys on a keyboard to construct command shortcuts. Based on this model, we implement a number of common document editing operations, including several page and element manipulations, shape and text input with styling, clipart retrieval and insertion, as well as undo/redo. The results of a lab study provide insights into the strengths and limitations of our approach.
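
To make the modifier-posture model concrete, the following is a minimal sketch of how a non-dominant-hand posture could gate the pen's mode, in the spirit of keyboard modifier keys. The posture names, pen modes, and posture-to-mode mapping shown here are illustrative assumptions, not the paper's actual implementation.

# A minimal sketch, assuming a simple event layer: the non-dominant hand's
# posture selects the pen mode (like Ctrl/Shift/Alt), and the dominant
# hand's pen stroke carries out the operation in that mode.
from enum import Enum, auto

class Posture(Enum):
    NONE = auto()          # non-dominant hand not touching the surface
    FLAT_HAND = auto()     # e.g., palm resting on the tabletop
    FIST = auto()
    TWO_FINGERS = auto()

class PenMode(Enum):
    INK = auto()           # default: free-form ink / writing
    SELECT = auto()        # pen lassos document elements
    ERASE = auto()
    COMMAND = auto()       # pen strokes interpreted as gesture shortcuts

# Hypothetical posture-to-mode table, analogous to modifier-key mappings.
MODE_TABLE = {
    Posture.NONE: PenMode.INK,
    Posture.FLAT_HAND: PenMode.SELECT,
    Posture.FIST: PenMode.ERASE,
    Posture.TWO_FINGERS: PenMode.COMMAND,
}

def resolve_pen_mode(posture: Posture) -> PenMode:
    """Quasi-mode: the pen mode holds only while the posture is maintained."""
    return MODE_TABLE.get(posture, PenMode.INK)

def on_pen_stroke(stroke_points, posture: Posture) -> str:
    """Route a pen stroke to the operation selected by the current posture."""
    mode = resolve_pen_mode(posture)
    return f"stroke of {len(stroke_points)} points handled in {mode.name} mode"

if __name__ == "__main__":
    stroke = [(10, 10), (12, 14), (15, 20)]
    print(on_pen_stroke(stroke, Posture.NONE))         # ink by default
    print(on_pen_stroke(stroke, Posture.TWO_FINGERS))  # gesture shortcut mode

Because the mode is tied to a held posture rather than a toggled state, releasing the non-dominant hand returns the pen to inking, which is one way such designs mitigate mode errors.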
