Design of human-centric adaptive multimodal interfaces

Multimodal interfaces have attracted increasing attention. Most research treats each interaction mode independently and fuses information at the application level. Recently, several frameworks and models have been proposed to support the design and development of multimodal interfaces. However, providing automatic modality adaptation in multimodal interfaces remains challenging. Existing approaches use rule-based specifications to define the adaptation of input/output modalities, but rule-based specifications suffer from problems of completeness and coherence. Distinct from previous work, this paper presents a novel approach that quantifies the user's preference for each modality and treats adaptation as an optimization problem: searching for a set of input/output modalities that best matches the user's preferences. Our approach applies a cross-layer design that considers adaptation from the perspectives of the interaction context, available system resources, and QoS requirements. Furthermore, our approach supports human-centric adaptation: a user can report a preference for a modality so that the selected modalities fit the user's personal needs. An optimal solution and a heuristic algorithm have been developed to automatically select an appropriate combination of modalities for a given situation. We have designed a framework based on the heuristic algorithm and existing ontologies, and applied the framework in a utility evaluation built around a within-subjects experiment. Fifty participants were invited to work through three scenarios and compare automatically selected modalities with randomly selected ones. The results show that users perceived the automatically selected modalities as appropriate and satisfactory.
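To make the optimization framing concrete, the sketch below shows one plausible shape of such a modality-selection heuristic: pick the modality combination that maximizes total user preference subject to a resource budget. This is a minimal illustration under stated assumptions, not the paper's actual algorithm; the names (`Modality`, `select_modalities`, `preference`, `cost`, `budget`) are hypothetical, and the greedy preference-per-cost rule is a generic stand-in for the heuristic the abstract mentions.

```python
# Hypothetical sketch: modality selection as constrained optimization.
# All names and the greedy rule are illustrative assumptions, not the
# paper's published method.
from dataclasses import dataclass


@dataclass
class Modality:
    name: str          # e.g. "speech-in", "haptic-out"
    preference: float  # user-reported preference score (higher is better)
    cost: float        # abstract resource cost (CPU, bandwidth, etc.)


def select_modalities(candidates: list[Modality], budget: float) -> list[Modality]:
    """Greedy heuristic: take modalities in order of preference-per-cost
    until the resource budget is exhausted."""
    chosen: list[Modality] = []
    remaining = budget
    for m in sorted(candidates, key=lambda m: m.preference / m.cost, reverse=True):
        if m.cost <= remaining:
            chosen.append(m)
            remaining -= m.cost
    return chosen


if __name__ == "__main__":
    candidates = [
        Modality("speech-in", preference=0.9, cost=3.0),
        Modality("touch-in", preference=0.7, cost=1.0),
        Modality("audio-out", preference=0.6, cost=2.0),
        Modality("visual-out", preference=0.8, cost=2.5),
    ]
    for m in select_modalities(candidates, budget=5.0):
        print(m.name)
```

Because budgeted subset selection of this kind is NP-hard in general, an exact solver is only practical for small modality sets, which is consistent with the paper's pairing of an optimal solution with a faster heuristic.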
