Evaluating the benefits of multimodal interface design for CoMPASS—a mobile GIS

The context of mobility raises many issues for geospatial applications providing location-based services. Mobile device limitations, such as a small user interface footprint and pen input whilst in motion, result in information overload and in interfaces that are difficult to navigate and interact with. This has become a major issue as mobile GIS applications are now used by a wide group of users, including novices such as tourists, for whom easy-to-use applications are essential. Despite this, comparatively little research has addressed the mobility problem. We are particularly concerned with the limited interaction techniques available to users of mobile GIS, which contribute substantially to the complexity of using such applications whilst mobile. Our research therefore focuses on multimodal interfaces as a means of presenting users with a wider choice of modalities for interacting with mobile GIS applications. Multimodal interaction is particularly advantageous in a mobile context, enabling users of location-based applications to choose the mode of input that best suits their current task and location. This article presents a comprehensive user study which demonstrates the benefits of multimodal interfaces for mobile geospatial applications.