The proliferation of novel types of gesture-based user interfaces has led to considerable fragmentation, both in terms of program code and in terms of the gestures themselves. Consequently, it is difficult for developers to build on previous work, which wastes valuable development time. Moreover, the flexibility of the resulting user interface is limited, particularly with respect to users wishing to customize the interface. To address this problem, we present a generic and extensible formal language to describe gestures. This language is applicable to a wide variety of input devices, such as multi-touch surfaces, pen-based input, tangible objects and even free-hand gestures. It enables the development of a generic gesture recognition engine which can serve as a backend to a wide variety of user interfaces. Moreover, rapid customization of the interface becomes possible by simply swapping gesture definitions, an aspect which has considerable advantages when conducting UI research or porting an existing application to a new type of input device. Developers benefit from the reduced amount of code, while users benefit from the increased flexibility that customization affords.
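The following is a minimal, hypothetical sketch of the idea summarized above: gestures are declared as data rather than hard-coded, and a small generic engine matches incoming input events against whichever definitions are currently loaded, so swapping the definitions reconfigures the interface without touching engine code. The dictionary keys, thresholds, and function names are illustrative assumptions, not the syntax of the language proposed in the paper.

```python
# Hypothetical sketch: declarative gesture definitions plus a generic matcher.
# The definition format below is an assumption for illustration only.

from dataclasses import dataclass
from typing import List

@dataclass
class Touch:
    """One input event from an arbitrary device (finger, pen, tangible)."""
    x: float
    y: float

# Interchangeable gesture definitions: replacing this list customizes the
# interface without modifying the recognition engine.
GESTURES = [
    {"name": "two_finger_tap", "min_points": 2, "max_points": 2, "max_spread": 0.2},
    {"name": "five_finger_grab", "min_points": 5, "max_points": 5, "max_spread": 1.0},
]

def spread(points: List[Touch]) -> float:
    """Largest pairwise distance between the contact points."""
    return max(
        ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
        for a in points for b in points
    )

def recognize(points: List[Touch], gestures=GESTURES):
    """Return the names of all gesture definitions matched by the input."""
    matches = []
    for g in gestures:
        if g["min_points"] <= len(points) <= g["max_points"] and \
           spread(points) <= g["max_spread"]:
            matches.append(g["name"])
    return matches

if __name__ == "__main__":
    # Two nearby contact points match the "two_finger_tap" definition.
    print(recognize([Touch(0.5, 0.5), Touch(0.55, 0.52)]))
```

Because the engine only consults the loaded definitions, porting to a new input device or running a UI study with alternative gesture sets amounts to supplying a different definition list, which is the customization benefit the abstract describes.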