Bending Blindly: Exploring Bend Gestures for the Blind

This paper explores the novel context of using bend gestures as a primary method of interaction for the blind. Our preliminary study assesses whether this more tactile experience could enhance the usability and accessibility of technology for blind users by comparing bend and touch interactions with participants simulating blindness. Both input techniques produced similar results, indicating that bend gestures have potential in this context. We identify findings that can help shape future research in this accessibility area and potentially improve the overall interaction experience with screen-reader-based smartphones.
