Accessibility layers: levelling the field for blind people in mobile social contexts

Enabling access to a computing device can have a profound impact on a person's quality of life. New technologies regularly emerge and reshape how we communicate, how we work, and even how we have fun. Paradoxically, a technology that empowers the general able-bodied user often fosters the exclusion of people with disabilities. The emergence of touch-based smartphones as the de facto mobile interaction device created a gap between those who could use the device as it shipped and those who could not. The potential uses of these devices exploded, and users who had until then operated mobile phones with physical keypads much like their peers suddenly found themselves, from one day to the next, left behind in mobile interaction. This was the case for blind people when smartphones started to dominate the market, circa 2007.
