Obstacle avoidance using haptics and a laser rangefinder

In essentially its current form, the white cane has been used by visually impaired people for almost a century. It remains one of the most basic yet useful navigation aids, mainly because of its simplicity and intuitive use. For people who have a motion impairment in addition to a visual one and therefore use a wheelchair or a walker, however, the white cane is impractical, making human assistance a necessity. This paper presents the prototype of a virtual white cane that uses a laser rangefinder to scan the environment and a haptic interface to present this information to the user. With the virtual white cane, the user can "poke" at obstacles several meters ahead without making physical contact with them. Because the information is delivered through a haptic interface, the interaction closely resembles how a regular white cane is used. This paper also presents the results from an initial field trial conducted with six people with a visual impairment.
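The core idea of mapping rangefinder readings to cane-like haptic feedback could be sketched as follows. This is a minimal illustration only: the function name, the spring-contact model, and all parameters (`max_range`, `stiffness`) are assumptions for exposition, not the paper's actual implementation.

```python
def scan_to_force(scan, cane_angle, max_range=4.0, stiffness=40.0):
    """Map a laser scan to a 1-D resistive force along the virtual cane.

    scan: list of (angle_rad, distance_m) pairs from the rangefinder.
    cane_angle: direction (rad) in which the user points the virtual cane.
    Returns a force in newtons; 0 when no obstacle lies within max_range.
    """
    # Pick the scan ray closest to the cane's pointing direction.
    angle, distance = min(scan, key=lambda ray: abs(ray[0] - cane_angle))
    if distance >= max_range:
        return 0.0  # free space: no contact, no force
    # Spring-like contact: force grows as the obstacle gets closer,
    # imitating the "poke" of a cane tip meeting a surface.
    penetration = max_range - distance
    return stiffness * penetration
```

A haptic loop would call such a function at a high rate (haptic rendering typically runs near 1 kHz) and apply the returned force to the handle, so that sweeping the device feels like tapping a long, rigid cane against distant obstacles.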
