Supporting Usability Evaluation of Multimodal Man-Machine Interfaces for Space Ground Segment Applications Using Petri Net-Based Formal Specification

This paper describes the issues raised by the evaluation of multimodal interfaces in the field of command and control workstations. Design, specification, verification and certification have already been identified as critical activities for such Man-Machine Interfaces (MMIs). This paper focuses on the issues raised by the evaluation of their usability. We first present a formalism (Interactive Cooperative Objects) and its related CASE tool (PetShop) for the specification of such MMIs, and then show how the models built can support the usability evaluation phase. As a case study, we present a multimodal interaction technique for navigation within a 3D satellite model.
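The central idea behind Petri-net-based specification of multimodal dialogue can be illustrated with a minimal sketch. The code below is NOT the ICO formalism or PetShop; it is a toy place/transition net (all names are hypothetical) showing why Petri nets suit multimodal fusion: a "fusion" transition is enabled only when tokens are present in both the speech and the gesture input places, so the temporal coordination of modalities is captured directly in the net structure.

```python
# Toy place/transition Petri net (illustrative sketch only; not ICO/PetShop).
# Place and transition names are invented for this example.

class PetriNet:
    def __init__(self):
        self.marking = {}  # place name -> token count

    def add_tokens(self, place, n=1):
        self.marking[place] = self.marking.get(place, 0) + n

    def enabled(self, inputs):
        # A transition is enabled when every input place holds a token.
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, inputs, outputs):
        """Fire a transition: consume one token per input place,
        produce one token per output place. Returns True if it fired."""
        if not self.enabled(inputs):
            return False
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.add_tokens(p)
        return True

net = PetriNet()
# A voice command is recognised ("move that here").
net.add_tokens("speech_recognised")
# Fusion cannot fire yet: the pointing gesture is still missing.
assert not net.fire(["speech_recognised", "gesture_recognised"],
                    ["fused_command"])
# The pointing gesture arrives.
net.add_tokens("gesture_recognised")
# Both modalities are present: the fusion transition fires.
assert net.fire(["speech_recognised", "gesture_recognised"],
                ["fused_command"])
assert net.marking["fused_command"] == 1
```

In a real ICO model the transitions would additionally carry code invoking the application, and analysis of the reachable markings is what supports verification and usability evaluation.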
