Proton++: a customizable declarative multitouch framework

Proton++ is a declarative multitouch framework that allows developers to describe multitouch gestures as regular expressions of touch event symbols. It builds on the Proton framework by allowing developers to incorporate custom touch attributes directly into the gesture description. These custom attributes increase the expressivity of gestures while preserving the benefits of Proton: automatic gesture matching, static analysis for conflict detection, and graphical gesture creation. We demonstrate Proton++'s flexibility with several examples: a direction attribute for describing trajectory, a pinch attribute for detecting when touches move toward one another, a touch area attribute for simulating pressure, an orientation attribute for selecting menu items, and a screen location attribute for simulating hand ID. We also use screen location to simulate user ID and enable simultaneous recognition of gestures by multiple users. In addition, we show how to incorporate timing into Proton++ gestures by reporting touch events at regular time intervals. Finally, we present a user study suggesting that users are roughly four times faster at interpreting gestures written with Proton++ than those written in the procedural event-handling code commonly used today.
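The core idea of matching gestures as regular expressions over touch event symbols can be sketched with ordinary tools. The snippet below is a minimal illustration, not the Proton++ API: it assumes a hypothetical encoding in which each event is a symbol `E<touch-id>:<attribute>` (E is D for touch-down, M for move, U for touch-up, and the attribute here is a direction value such as N/S/E/W), and uses Python's `re` module to match a made-up "swipe east" gesture against a serialized event stream.

```python
import re

def tokenize(events):
    """Serialize (event_type, touch_id, attribute) tuples into one
    space-separated symbol string, e.g. 'D1:E M1:E U1:E'."""
    return " ".join(f"{e}{t}:{a}" for e, t, a in events)

# Hypothetical gesture: touch 1 goes down (any direction attribute),
# moves east one or more times, then lifts (any direction attribute).
SWIPE_EAST = re.compile(r"D1:\w (M1:E )+U1:\w")

stream = [("D", 1, "E"), ("M", 1, "E"), ("M", 1, "E"), ("U", 1, "E")]
print(bool(SWIPE_EAST.fullmatch(tokenize(stream))))  # True
```

Real Proton++ additionally handles interleaved touch IDs, attribute generators, and conflict analysis between gesture expressions; this sketch only shows why a regex over a symbol stream is a natural fit for sequential touch input.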
