Description languages for multimodal interaction: a set of guidelines and its illustration with SMUIML

This article addresses the problem of modeling multimodal interaction with markup languages. After an analysis of the current state of the art in multimodal interaction description languages, nine guidelines for languages dedicated to multimodal interaction description are introduced, along with four roles that such languages should target: communication, configuration, teaching, and modeling. The article then presents SMUIML, our proposed language, which improves the handling of time synchronicity while still fulfilling the other guidelines. Finally, SMUIML is mapped to these guidelines as a way to evaluate their coverage and to sketch future work.