Unifying Events from Multiple Devices for Interpreting User Intentions through Natural Gestures

As technology evolves (e.g., 3D cameras, accelerometers, multi-touch surfaces), new gestural interaction methods are becoming part of the everyday use of computational devices. This trend forces practitioners to develop applications for each interaction method individually. This paper tackles the problem of interpreting gestures in scenarios with multiple modes of interaction by focusing on the abstract gesture rather than on the technology or technologies used to generate it. It describes the Flash Library for Interpreting Natural Gestures (FLING), a framework for developing multi-gestural applications that integrates with and runs on different gestural platforms. By offering an architecture for the integration and unification of different types of interaction, FLING eases scalability while providing an environment for rapid prototyping by novice multi-gestural programmers. Throughout the article we analyse the benefits of this approach and compare it with state-of-the-art technologies, describe the framework architecture, and present several example applications and experiences of use.
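To make the unification idea concrete, the sketch below (in TypeScript rather than FLING's original ActionScript) shows one way device-specific input, such as TUIO cursor messages and mouse events, could be normalized into a single abstract gesture stream that application code consumes without knowing the source technology. All names here (GestureEvent, GestureBus, TuioAdapter, MouseAdapter) are hypothetical illustrations of the architectural pattern, not FLING's actual API.

```typescript
// A minimal sketch of the event-unification pattern, assuming hypothetical
// names (GestureEvent, GestureBus, adapters); FLING's real API differs.

/** Device-agnostic gesture event: application code sees only this. */
interface GestureEvent {
  kind: "press" | "move" | "release";
  x: number;           // normalized [0, 1] surface coordinates
  y: number;
  pointerId: number;   // distinguishes concurrent touches/cursors
  source: string;      // informational only, e.g. "tuio" or "mouse"
}

/** Central bus: input adapters publish, gesture recognizers subscribe. */
class GestureBus {
  private listeners: Array<(e: GestureEvent) => void> = [];
  subscribe(fn: (e: GestureEvent) => void): void {
    this.listeners.push(fn);
  }
  publish(e: GestureEvent): void {
    for (const fn of this.listeners) fn(e);
  }
}

/** Adapter translating TUIO-style cursor updates into GestureEvents. */
class TuioAdapter {
  constructor(private bus: GestureBus) {}
  onCursorUpdate(sessionId: number, nx: number, ny: number): void {
    // TUIO already reports normalized coordinates.
    this.bus.publish({ kind: "move", x: nx, y: ny, pointerId: sessionId, source: "tuio" });
  }
}

/** Adapter translating pixel-based mouse events into the same abstraction. */
class MouseAdapter {
  constructor(private bus: GestureBus, private w: number, private h: number) {}
  onMouseMove(px: number, py: number): void {
    this.bus.publish({ kind: "move", x: px / this.w, y: py / this.h, pointerId: 0, source: "mouse" });
  }
}

// Application code is written once, against the abstract stream:
const bus = new GestureBus();
bus.subscribe((e) => console.log(`${e.kind} at (${e.x.toFixed(2)}, ${e.y.toFixed(2)}) from ${e.source}`));
new TuioAdapter(bus).onCursorUpdate(7, 0.25, 0.75);
new MouseAdapter(bus, 1920, 1080).onMouseMove(960, 540);
```

Under this design, adding a new input technology means writing one adapter; gesture recognizers and application logic remain untouched, which is the scalability property the abstract claims for the unification architecture.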
