Linespace: A Sensemaking Platform for the Blind

For visually impaired users, making sense of spatial information is difficult: they have to scan and memorize content before they can analyze it. Worse, any update to the displayed content invalidates their spatial memory, which can force them to manually rescan the entire display. Making display contents persist, we argue, is thus the highest priority in designing a sensemaking system for the visually impaired. We present a tactile display system designed with this goal in mind. The foundation of our system is a large tactile display (140 × 100 cm, 23× larger than HyperBraille), which we achieve by using a 3D printer to print raised lines of filament. The system's software then spends this space to minimize screen updates. Instead of panning and zooming, for example, our system creates additional views, leaving display contents intact and thus helping users preserve their spatial memory. We illustrate our system and its design principles with four spatial applications. We evaluated the system with six blind users. Participants responded favorably and noted, for example, that having multiple views at the same time was helpful. They also judged the increased expressiveness of lines over the more traditional dots useful for encoding information.