The design and implementation of a walking assistant system with vibrotactile indication and voice prompt for the visually impaired

This paper presents a comprehensive navigation system that assists visually impaired people in walking safely around their living areas. The system provides vibrotactile cues indicating navigation directions and obstacle-free directions during outdoor walking. Cluttered environmental information is acquired in real time by Kinect cameras and ultrasonic sensors. All direction and obstacle-avoidance instructions are transformed into a series of vibration stimuli and actuated by a vibration belt. Meanwhile, GPS and speech recognition modules are used for navigation: users can set a destination through the speech recognition module and obtain their current location from the GPS module. With the help of a Google terminal, the system can provide wide-range path navigation to users. Three experiments were conducted to verify the effectiveness of the system. The results show that the proposed environment detection and vibrotactile indication methods are effective.
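
To illustrate the kind of mapping the abstract describes, the sketch below (not the authors' code) converts an obstacle's bearing and distance, as might be estimated from the Kinect and ultrasonic readings, into a motor index and vibration intensity on an N-motor belt. All names (NUM_MOTORS, MAX_RANGE_M, select_cue) and the specific quantization scheme are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, assuming an N-motor belt worn around the waist and a
# per-obstacle (bearing, distance) estimate from the sensing front end.
# NUM_MOTORS, MAX_RANGE_M and select_cue are hypothetical names.

NUM_MOTORS = 8          # motors spaced evenly around the waist
MAX_RANGE_M = 4.0       # obstacles farther than this produce no cue

def select_cue(bearing_deg: float, distance_m: float):
    """Return (motor_index, intensity in [0, 1]) for one detected obstacle.

    bearing_deg: obstacle direction relative to the user's heading,
                 0 = straight ahead, positive = clockwise.
    distance_m:  obstacle distance from the user.
    """
    if distance_m >= MAX_RANGE_M:
        return None                      # too far away to signal
    # Quantize the bearing to the nearest motor around the belt.
    sector = 360.0 / NUM_MOTORS
    motor = int(((bearing_deg % 360.0) + sector / 2) // sector) % NUM_MOTORS
    # Closer obstacles produce stronger vibration.
    intensity = 1.0 - distance_m / MAX_RANGE_M
    return motor, intensity

if __name__ == "__main__":
    # An obstacle 1 m away, slightly to the right of the walking direction.
    print(select_cue(20.0, 1.0))   # -> (0, 0.75)
```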
