The Design and Preliminary Evaluation of a Finger-Mounted Camera and Feedback System to Enable Reading of Printed Text for the Blind

We introduce the preliminary design of a novel vision-augmented touch system called HandSight intended to support activities of daily living (ADLs) by sensing and feeding back non-tactile information about the physical world as it is touched. Though we are interested in supporting a range of ADL applications, here we focus specifically on reading printed text. We discuss our vision for HandSight, describe its current implementation, and present results from an initial performance analysis of finger-based text scanning. We then present a user study with four visually impaired participants (three blind) exploring how to continuously guide a user’s finger across text under three feedback conditions (haptic only, audio only, and both). Though preliminary, our results show that participants valued the ability to access printed material and that, in contrast to previous findings, audio finger guidance may yield the best reading performance.
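
To make the guidance loop concrete, the sketch below shows one plausible way the per-frame vertical offset between the fingertip camera's view and the current text baseline could drive the audio and haptic channels described above. This is a minimal illustrative sketch, not the system's actual implementation: the function names (audio_guidance, haptic_guidance), the 440 Hz center tone, the tolerance band, and the pitch and intensity mappings are all our own assumptions.

```python
# Illustrative finger-guidance sketch (hypothetical parameters; the actual
# HandSight mappings may differ). Positive vertical_offset_px means the
# finger has drifted above the text line; negative means below.

BASELINE_TOLERANCE_PX = 5    # drift within this band needs no correction
CENTER_PITCH_HZ = 440.0      # tone played when the finger is on the line
PITCH_STEP_HZ = 10.0         # pitch change per pixel of vertical drift


def audio_guidance(vertical_offset_px):
    """Map vertical drift from the text baseline to a guidance tone (Hz).

    Drift above the line raises the pitch and drift below lowers it,
    so the user steers back toward the center frequency.
    """
    if abs(vertical_offset_px) <= BASELINE_TOLERANCE_PX:
        return CENTER_PITCH_HZ
    return CENTER_PITCH_HZ + PITCH_STEP_HZ * vertical_offset_px


def haptic_guidance(vertical_offset_px):
    """Map vertical drift to (motor, intensity) for two vibration motors.

    Buzzes the motor on the side the finger should move toward, with
    intensity growing with drift and saturating at 50 px.
    """
    if abs(vertical_offset_px) <= BASELINE_TOLERANCE_PX:
        return ("none", 0.0)
    motor = "down" if vertical_offset_px > 0 else "up"
    intensity = min(1.0, abs(vertical_offset_px) / 50.0)
    return (motor, intensity)
```

For example, under these assumed parameters a 12 px drift above the line gives audio_guidance(12.0) == 560.0 Hz and haptic_guidance(12.0) == ("down", 0.24), both cueing the user to move the finger back down toward the line.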
