Multimodal eyes-free exploration of maps: TIKISI for maps

Touch It, Key It, Speak It (Tikisi) is a software framework for accessible exploration of graphical information by vision-impaired users. Multimodal input to Tikisi comes through multitouch gestures, keystrokes, and spoken commands; output is generated speech. The key insight in Tikisi is the decoupling of input and output resolutions, achieved by overlaying a virtual, variable-resolution grid on an application; the grid supports touch-based exploration of graphics at different levels of granularity. Using Tikisi for Maps, a vision-impaired user can run a finger over a geographical map and issue commands to center, rescale, or zoom the map; to go to specific locations such as cities or states; to find features such as water/land boundaries; and to summarize contextual spatial information at a location. In this paper we describe the architecture and implementation of Tikisi and the capabilities of Tikisi for Maps, and we discuss the results of a preliminary formative usability study of the system.
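To make the resolution-decoupling idea concrete, here is a minimal sketch of how a virtual, variable-resolution grid might quantize touch input. This is our own illustration, not code from the paper: the class name VirtualGrid, its fields, and the rescale factor are all assumptions. Touch points arrive at a fixed input resolution (normalized here to 0..1), while the grid's granularity, and hence the output resolution, can be changed independently.

```python
# Illustrative sketch (assumed design, not the authors' implementation):
# a virtual grid that maps fixed-resolution touch points to cells whose
# granularity can be rescaled without changing the touch input itself.

from dataclasses import dataclass


@dataclass(frozen=True)
class VirtualGrid:
    rows: int
    cols: int

    def cell_at(self, x: float, y: float) -> tuple[int, int]:
        """Map a normalized touch point (x, y in 0..1) to a (row, col) cell."""
        col = min(int(x * self.cols), self.cols - 1)
        row = min(int(y * self.rows), self.rows - 1)
        return row, col

    def rescale(self, factor: float) -> "VirtualGrid":
        """Return a grid over the same map area with granularity scaled by
        `factor` (>1 is finer, <1 is coarser)."""
        return VirtualGrid(max(1, round(self.rows * factor)),
                           max(1, round(self.cols * factor)))


# The same finger position resolves to different cells depending on the
# grid's granularity, while the touch resolution is unchanged.
coarse = VirtualGrid(rows=4, cols=4)
fine = coarse.rescale(4)           # 16x16 grid over the same map
print(coarse.cell_at(0.62, 0.30))  # (1, 2)
print(fine.cell_at(0.62, 0.30))    # (4, 9)
```

Under this reading, commands such as "rescale" or "zoom" would swap in a finer or coarser grid, so spoken feedback can describe the map at whatever level of detail the cell size implies, with no change to the underlying touch or display hardware.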
