Speed-Dial: A Surrogate Mouse for Non-Visual Web Browsing

Sighted people browse the Web almost exclusively with a mouse, because browsing mostly entails pointing at and clicking on elements of a web page, and a mouse makes both operations nearly instantaneous. People with vision impairments cannot use a mouse, since it provides only visual feedback via a cursor. Instead, they must build a mental map of the web page through a slow and tedious process, relying primarily on a screen reader's keyboard shortcuts and its serial audio readout of the page's textual content and metadata, which often causes content and cognitive overload. This paper describes Speed-Dial, our system that uses an off-the-shelf physical Dial as a surrogate for the mouse in non-visual web browsing. Speed-Dial interfaces the physical Dial with a semantic model of the web page and provides intuitive, rapid access to the entities in the model and their content, bringing blind people's browsing experience closer to how sighted people perceive and interact with the Web. A user study with blind participants suggests that Speed-Dial lets them quickly move around a web page and select content of interest, akin to pointing and clicking with a mouse.
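The interaction model the abstract describes can be made concrete with a small sketch. The Python below is a hypothetical illustration, not the authors' implementation: the Entity, DialNavigator, announce, and activate names are all assumptions. It shows the core idea of rotating the dial to cycle audibly through entities at the current level of the page's semantic model, and pressing it to drill in or select, mirroring point-and-click.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    """A node in the page's semantic model (e.g., a menu, a result list, a form)."""
    label: str
    children: List["Entity"] = field(default_factory=list)

def announce(text: str) -> None:
    """Stand-in for screen-reader speech output."""
    print(f"[TTS] {text}")

def activate(entity: Entity) -> None:
    """Stand-in for acting on a leaf entity (e.g., following a link)."""
    print(f"[ACTION] {entity.label}")

class DialNavigator:
    """Maps dial gestures onto the semantic model: rotation cycles focus
    through sibling entities (akin to moving the mouse pointer), and a
    press drills into or activates the focused entity (akin to clicking)."""

    def __init__(self, root: Entity):
        self.current = root   # entity whose children are being browsed
        self.index = 0        # position of the focus among those children

    def on_rotate(self, steps: int) -> None:
        siblings = self.current.children
        if not siblings:
            return
        # Rotation moves focus among siblings, wrapping at either end.
        self.index = (self.index + steps) % len(siblings)
        announce(siblings[self.index].label)

    def on_press(self) -> None:
        focused = self.current.children[self.index]
        if focused.children:
            # Descend one level: the focused entity becomes the browsing context.
            self.current, self.index = focused, 0
            announce(f"Entered {focused.label}")
        else:
            activate(focused)

# Toy semantic model of a news page, and a short interaction trace.
page = Entity("Page", [
    Entity("Menu", [Entity("Home"), Entity("Sports")]),
    Entity("Headlines", [Entity("Story 1"), Entity("Story 2")]),
])
nav = DialNavigator(page)
nav.on_rotate(1)  # focus moves to "Headlines"
nav.on_press()    # drill into the headline list
nav.on_rotate(1)  # focus moves to "Story 2"
nav.on_press()    # activate "Story 2"
```

The sketch deliberately reduces the interaction to two gestures: because every movement lands on a semantic entity rather than a pixel position, the audio feedback after each rotation plays the role the visual cursor plays for sighted users.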
