Using wrist vibrations to guide hand movement and whole-body navigation

Summary: In the absence of vision, mobility and orientation are challenging. Audio and tactile feedback can be used to guide visually impaired people. In this paper, we present two complementary studies on the use of vibrational cues for hand guidance during the exploration of itineraries on a map, and for whole-body guidance in a virtual environment. Concretely, we designed wearable Arduino bracelets integrating a vibratory motor that produces multiple patterns of pulses. In the first study, this bracelet was used to guide the hand along unknown routes on an interactive tactile map. A Wizard-of-Oz study with six blindfolded participants showed that tactons, i.e., structured vibration patterns, may be more efficient than audio cues for indicating directions. In the second study, the bracelet was used by blindfolded participants to navigate in a virtual environment. The results show that vibrational cues can significantly decrease travel distance. In sum, these preliminary but complementary studies suggest the potential of vibrational feedback in assistive technologies for the mobility and orientation of blind people.
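The paper does not specify how the bracelet's tactons are encoded, so the following is a purely illustrative sketch: one common approach is to represent each tacton as a sequence of (on, off) pulse durations that the firmware plays on the vibration motor. All pattern names and timing values below are hypothetical, not taken from the paper.

```python
# Illustrative encoding of direction tactons as vibration pulse patterns.
# Each tacton is a list of (on_ms, off_ms) intervals; the motor vibrates
# for on_ms, then rests for off_ms, for each pulse in turn.
# All values are hypothetical examples, not the paper's actual patterns.

TACTONS = {
    "left":    [(100, 100)],                # one short pulse
    "right":   [(100, 100), (100, 100)],    # two short pulses
    "forward": [(400, 0)],                  # one long continuous pulse
}

def tacton_duration_ms(name):
    """Total playback time of a tacton, in milliseconds."""
    return sum(on + off for on, off in TACTONS[name])

def render_timeline(name):
    """Flatten a tacton into (state, duration) steps for a motor driver."""
    steps = []
    for on, off in TACTONS[name]:
        steps.append(("on", on))
        if off:
            steps.append(("off", off))
    return steps
```

On the actual device, `render_timeline` would drive a motor pin (e.g. toggling a GPIO with the listed delays); here it only models the timing so the patterns can be inspected or tested off-hardware.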
