Sensor-guided jogging for visually impaired

This paper introduces an approach that enables blind and visually impaired people to practice jogging by combining 3D environment perception for course detection and collision avoidance with intuitive feedback generation. In addition to the system concept, first prototype implementations are presented that confirm the general feasibility of the approach in a domain that research has not addressed so far.
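The abstract does not detail the perception pipeline, but the approach rests on 3D environment perception for course detection and collision avoidance. As a minimal sketch of one plausible building block, the example below performs RANSAC ground-plane segmentation on an RGB-D point cloud with the Point Cloud Library and treats every point off the ground plane as a potential obstacle; the input file name, point type, and distance threshold are illustrative assumptions, not details taken from the paper.

// Sketch: ground-plane extraction from a point cloud with PCL, one plausible
// step toward course detection and obstacle avoidance. Parameter values and
// the input file "scene.pcd" are assumptions for illustration only.
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <pcl/filters/extract_indices.h>
#include <iostream>

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  if (pcl::io::loadPCDFile<pcl::PointXYZ>("scene.pcd", *cloud) < 0)  // hypothetical input
  {
    std::cerr << "Could not read scene.pcd" << std::endl;
    return 1;
  }

  // Fit the dominant plane (assumed to be the ground) with RANSAC.
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.03);  // 3 cm inlier tolerance, an assumed value
  seg.setInputCloud(cloud);

  pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
  seg.segment(*inliers, *coefficients);

  // Everything that is not part of the ground plane is a potential obstacle.
  pcl::ExtractIndices<pcl::PointXYZ> extract;
  extract.setInputCloud(cloud);
  extract.setIndices(inliers);
  extract.setNegative(true);
  pcl::PointCloud<pcl::PointXYZ>::Ptr obstacles(new pcl::PointCloud<pcl::PointXYZ>);
  extract.filter(*obstacles);

  std::cout << "Ground inliers: " << inliers->indices.size()
            << ", potential obstacle points: " << obstacles->size() << std::endl;
  return 0;
}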
