Human navigation assistance with an RGB-D sensor

This paper focuses on the creation of a human navigation assistance prototype. The system uses a conventional RGB-D camera and a laptop to analyze the environment surrounding the user and provides the user with enough information for safe navigation. The system is designed to work indoors and performs two main tasks: floor and obstacle detection, and staircase detection. Both tasks make use of the range and visual information captured by the sensor. The camera points downwards, allowing it to acquire relevant navigation information without invading the privacy of other people. The system has been tested in real environments, showing good results in the detection of obstacles and staircases.
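To illustrate one building block of the kind the abstract describes (floor and obstacle detection from a downward-facing RGB-D camera), the following is a minimal sketch of a RANSAC floor-plane fit over a point cloud, written with NumPy only. It is not the authors' implementation: the function name `fit_floor_plane`, the inlier threshold, the iteration count, and the synthetic test scene are all illustrative assumptions.

```python
import numpy as np

def fit_floor_plane(points, n_iters=200, inlier_thresh=0.03, rng=None):
    """RANSAC plane fit: returns (normal, d, inlier_mask) for the model n.p + d = 0.

    `points` is an (N, 3) array of 3D points, e.g. back-projected from a depth image.
    Thresholds are illustrative; a real system would tune them to the sensor noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample three points and build a candidate plane through them.
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # degenerate (collinear) sample, try again
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        # Count points close enough to the candidate plane.
        dist = np.abs(points @ normal + d)
        inliers = dist < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane[0], best_plane[1], best_inliers

if __name__ == "__main__":
    # Synthetic scene standing in for a downward-facing view: a flat floor at y = 0
    # plus a small box-shaped obstacle rising above it.
    rng = np.random.default_rng(0)
    floor = np.column_stack([rng.uniform(-2, 2, 3000),
                             rng.normal(0, 0.005, 3000),
                             rng.uniform(0.5, 3, 3000)])
    box = np.column_stack([rng.uniform(0.2, 0.5, 300),
                           rng.uniform(0.0, 0.4, 300),
                           rng.uniform(1.0, 1.3, 300)])
    pts = np.vstack([floor, box])
    normal, d, inliers = fit_floor_plane(pts, rng=rng)
    # Points far from the fitted floor plane become obstacle candidates.
    print(f"floor inliers: {inliers.sum()}, obstacle candidates: {(~inliers).sum()}")
```

In the same spirit, staircase detection is often approached by finding several parallel horizontal planes at regular height offsets; the sketch above only covers the single dominant floor plane.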
