An Accurate Localization Scheme for Mobile Robots Using Optical Flow in Dynamic Environments

Visual Simultaneous Localization and Mapping (Visual SLAM) has been studied extensively in recent years, and many state-of-the-art algorithms achieve satisfactory performance in static scenarios. In dynamic scenarios, however, off-the-shelf Visual SLAM algorithms cannot localize the robot accurately. To address this problem, we propose a novel method that uses optical flow to distinguish and eliminate dynamic feature points from the extracted ones, using RGB images as the only input. The remaining static feature points are fed into the Visual SLAM algorithm for camera pose estimation. We integrate our method with the ORB-SLAM system and validate it on challenging dynamic sequences from the TUM dataset. The entire system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of Visual SLAM in dynamic scenarios.
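The abstract does not spell out the exact motion-segmentation criterion, but a common way to realize "optical flow plus geometric consistency" filtering is sketched below. This is a minimal illustration, not the paper's implementation: the function name filter_dynamic_points and the threshold epi_thresh are hypothetical, and the sketch assumes OpenCV's pyramidal Lucas-Kanade tracker together with a RANSAC-fitted fundamental matrix; points whose epipolar error is large are treated as dynamic and discarded.

```python
import cv2
import numpy as np

def filter_dynamic_points(prev_gray, curr_gray, prev_pts, epi_thresh=1.0):
    """Track features with Lucas-Kanade optical flow, fit the dominant
    epipolar geometry with RANSAC, and drop points whose epipolar error
    is large (likely belonging to moving objects).

    Illustrative sketch only; names and thresholds are assumptions.
    """
    # Track each feature point from the previous frame into the current one.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None)
    ok = status.ravel() == 1
    p0 = prev_pts[ok].reshape(-1, 2)
    p1 = curr_pts[ok].reshape(-1, 2)

    # RANSAC fits the fundamental matrix to the dominant (static) motion.
    F, _ = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
    if F is None:
        return p1  # too few matches to fit a model; keep what we have

    # Distance of each tracked point to its epipolar line in the new frame.
    ones = np.ones((len(p0), 1))
    lines = (F @ np.hstack([p0, ones]).T).T        # lines a*x + b*y + c = 0
    num = np.abs(np.sum(np.hstack([p1, ones]) * lines, axis=1))
    den = np.hypot(lines[:, 0], lines[:, 1])
    dist = num / den

    # Points far from their epipolar lines move inconsistently with the
    # camera motion and are treated as dynamic; the rest are kept as static.
    return p1[dist < epi_thresh]
```

In a full pipeline, prev_pts could come from a detector such as cv2.goodFeaturesToTrack(prev_gray, 500, 0.01, 10) or from ORB keypoint locations, and the surviving static points would then be handed to the SLAM front end for camera pose estimation, as the abstract describes.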
