An innovative framework to support multimodal interaction with Smart Environments

Highlights

- We outline an original approach to the design of multimodal applications (MA).
- A dialogue model provides a robust support layer for multimodal interaction.
- The approach is valid for integration on today's devices.
- The approach is flexible with a view to implementing MA in future Smart Environments.
- We discuss a sample application that validates the approach, with implementation details.

Interaction with future Smart Environments requires research on methods for the design of a new generation of human-environment interfaces. The paper outlines an original approach to the design of multimodal applications that, while valid for integration on today's devices, also aims to be flexible enough to remain consistent through the transition to future Smart Environments. These environments will likely be structured in a more complex manner, requiring that interaction with the services they offer be made available through the integration of multimodal/unimodal interfaces provided through objects of everyday use. In line with the most recent research trends, the approach is centred not only on the user-interface part of a system but on the design of a comprehensive solution, including a dialogue model meant to provide a robust support layer upon which multimodal interaction builds. Specific characteristics of the approach, and of a sample application being developed to validate it, are discussed in the paper, along with some implementation details.
