Recognizing Hand-drawn Glyphs from One Example and Four Lines of Code

The biggest challenge in developing gesture-based user interfaces is creating a gesture recognizer. Existing approaches to high-level recognition of glyphs demand considerable developer effort, are error prone, and suffer from low recognition rates. We propose a tool that generates a recognizer for hand-drawn glyphs from a single example. Our tool uses the output of a basic shape recognizer as input to glyph recognition. The generated recognizer can be integrated into an app by adding only four lines of code. By reducing the required development effort, the approach makes it practical for many touch-interaction apps to take advantage of hand-drawn content. We demonstrate the tool's effectiveness with two examples. Furthermore, our within-subject evaluation shows that programmers with no knowledge of gesture recognition can generate a recognizer and integrate it into an app more quickly and easily than by manually coding recognition rules, and that the generated recognizer is more accurate than a manually coded one.
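To make the described pipeline concrete, the sketch below illustrates how a basic shape recognizer's output might feed a generated glyph recognizer, ending in a four-line integration. All class and method names here are illustrative assumptions, not the tool's actual API.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract.
# Names (ShapeRecognizer, GlyphRecognizer, .recognize) are assumptions for
# illustration only; they are not the tool's real API.

class ShapeRecognizer:
    """Stand-in for the basic shape recognizer whose output feeds glyph recognition."""

    def recognize(self, strokes):
        # A real implementation would classify each stroke into a primitive
        # shape (line, arc, ellipse, ...); here we label every stroke "line".
        return [("line", stroke) for stroke in strokes]


class GlyphRecognizer:
    """Stand-in for a recognizer generated from a single hand-drawn example."""

    def __init__(self, model_path):
        # Path to the generated recognizer definition (hypothetical).
        self.model_path = model_path

    def recognize(self, shapes):
        # A generated recognizer would match the shape sequence against the
        # learned glyph description; we return a fixed label for illustration.
        return "arrow" if shapes else None


# The advertised integration: roughly four lines in the host app.
shape_rec = ShapeRecognizer()
glyph_rec = GlyphRecognizer("arrow.glyph")
shapes = shape_rec.recognize([[(0, 0), (10, 10)]])
label = glyph_rec.recognize(shapes)  # → "arrow"
```

The point of the sketch is the division of labor: low-level stroke-to-shape classification is reusable across glyphs, while the glyph-specific matching is generated per example, so the app developer only wires the two stages together.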
