Proton: multitouch gestures as regular expressions

Current multitouch frameworks require application developers to write recognition code for custom gestures; this code is split across multiple event-handling callbacks. As the number of custom gestures grows, it becomes increasingly difficult to (1) determine whether new gestures conflict with existing ones, and (2) extend the existing code to reliably recognize the complete gesture set. Proton is a novel framework that addresses both problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time. Proton contributes a graphical editor for authoring tablatures and automatically compiles them into regular expressions. We present the architecture and implementation of Proton, along with three proof-of-concept applications. These applications demonstrate the expressiveness of the framework and show how Proton simplifies gesture definition and conflict resolution.
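
The core idea can be illustrated with a minimal sketch (this is not Proton's actual API or event alphabet, just an assumption-laden toy): touch events are encoded as tokens such as "D1" (first touch down), "M1" (first touch move), and "U1" (first touch up), and a gesture is an ordinary regular expression over the resulting token stream.

```python
import re

# Hypothetical token encoding: "D"/"M"/"U" for down/move/up, followed by a touch ID.
# Gestures are plain regular expressions over the token stream.

# One-finger drag: touch down, one or more moves, touch up.
DRAG = re.compile(r"(D1)(M1)+(U1)")

# Two-finger pinch: first touch down, optional moves, second touch down,
# interleaved moves, then both touches released.
PINCH = re.compile(r"(D1)(M1)*(D2)((M1)|(M2))*(U1)(M2)*(U2)")

def tokenize(events):
    """Flatten (action, touch_id) pairs into a token string."""
    return "".join(f"{action}{tid}" for action, tid in events)

stream = tokenize([("D", 1), ("M", 1), ("M", 1), ("U", 1)])
print(bool(DRAG.fullmatch(stream)))   # True: the stream is a one-finger drag
print(bool(PINCH.fullmatch(stream)))  # False: no second touch ever appears
```

Because gestures are expressed in a single formalism rather than scattered across callbacks, questions such as "can an incoming event stream still match more than one gesture?" become questions about the regular expressions themselves, which is what enables Proton's static conflict analysis.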
