Automatic, intuitive zooming for people who are blind or visually impaired.

In this paper we present a novel technique for automatic, "intuitive" zooming of graphical information for individuals who are blind or visually impaired. The idea is to automatically offer the user only zoom levels whose content differs significantly from the previous level and which preserve the cognitive grouping of information (such as whole objects or whole object parts), thereby making "intuitive" sense. The algorithm uses wavelet analysis to localize the details of a graphic, then examines the clustering of those details to decide on the zoom levels. An initial pilot study of this zooming method with three visually impaired and four sighted participants is presented. Results show that all participants liked the intuitive zooming over areas of detail and being prevented from zooming where no details were present. Almost all participants required zooming to perform the identification task, which they completed with an 86% correct rate.
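The core idea, selecting only zoom levels whose wavelet detail coefficients carry significant new content, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses a hand-rolled one-dimensional Haar transform, and the function names and energy threshold are assumptions introduced here for illustration.

```python
def haar_details(signal):
    """One level of the Haar wavelet transform.

    Returns (approximation, detail) coefficient lists; the detail
    coefficients localize fine structure at the current scale.
    """
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        approx.append((signal[i] + signal[i + 1]) / 2.0)
        detail.append((signal[i] - signal[i + 1]) / 2.0)
    return approx, detail


def intuitive_zoom_levels(signal, threshold=0.1):
    """Return the scales whose mean detail energy exceeds `threshold`.

    Scales with negligible detail energy add no new content over the
    previous level, so zooming to them would be suppressed. The
    threshold value is an illustrative assumption.
    """
    levels = []
    current = list(signal)
    scale = 1
    while len(current) >= 2:
        current, detail = haar_details(current)
        energy = sum(d * d for d in detail) / len(detail)
        if energy > threshold:
            levels.append(scale)
        scale += 1
    return levels
```

For example, a signal that alternates rapidly, `[0, 1, 0, 1, 0, 1, 0, 1]`, has all of its detail energy at the finest scale, so only that zoom level is offered; a constant signal yields no levels at all, modeling the "no details, no zoom" behavior reported in the study.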
