A simple and parallel algorithm for robot position estimation by stereo visual-inertial sensor fusion

Stereo visual-inertial sensor fusion has become popular over the past decade for robot position estimation because it is effective in environments without global position information. By fusing stereo vision with an AHRS (Attitude and Heading Reference System: three-axis gyroscopes, accelerometers, and magnetometers) and exploiting the perspective projection property of the camera, we propose a new adaptive algorithm for estimating the robot position. With this real-time method, the robot can be localized in environments lacking global position information, and we prove theoretically that the estimated positions converge to their true values. The algorithm is simple to implement and can be processed in parallel on a GPU (Graphics Processing Unit) to compute positions in real time. The performance of the real-time position estimation algorithm is validated by an experiment.
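The abstract does not give the algorithm's details, but the perspective-projection relation it relies on is standard. As a hedged illustration only (not the authors' method), the sketch below shows textbook rectified-stereo triangulation and how a known landmark position, combined with an AHRS-derived orientation, yields the camera (robot) position; all function names and parameter values are hypothetical.

```python
import numpy as np

def triangulate(uL, vL, uR, f, cx, cy, baseline):
    """Textbook pinhole triangulation for a rectified stereo pair.

    Depth follows from the disparity (uL - uR); the lateral
    coordinates are then back-projected via the perspective model.
    """
    disparity = uL - uR          # horizontal pixel shift between views
    Z = f * baseline / disparity # depth along the optical axis
    X = (uL - cx) * Z / f        # back-project image x
    Y = (vL - cy) * Z / f        # back-project image y
    return np.array([X, Y, Z])

def robot_position(p_cam, R_world_cam, p_landmark_world):
    """Camera position from one landmark with known world coordinates.

    Using p_world = R * p_cam + t, the translation (robot position)
    is t = p_world - R * p_cam, with R supplied by the AHRS attitude.
    """
    return p_landmark_world - R_world_cam @ p_cam

# Illustrative numbers (all assumed): f = 500 px, 0.12 m baseline.
p_cam = triangulate(uL=370, vL=240, uR=320, f=500, cx=320, cy=240,
                    baseline=0.12)
t = robot_position(p_cam, np.eye(3), np.array([1.0, 2.0, 3.0]))
```

In the paper's setting the landmark coordinates and the AHRS rotation would come from the proposed adaptive estimator rather than being given, so this sketch only fixes the geometric relations involved.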