A new pose estimation method based on inertial and visual sensors for autonomous robots

Inertial and visual sensors are widely used on-board sensors for robots, and combining inertial and visual information helps estimate a robot's pose robustly. In this paper, a new sensing system with inertial and active visual sensors is proposed, and a hierarchical fusion strategy is suggested. A pose estimation method based on optimization fusion is then presented, which takes image features as constraints to optimize the inertial parameters; it has the potential to estimate the pose faster and more accurately. Simulations are provided to assess the proposed integration method.
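To make the optimization-fusion idea concrete, the sketch below shows one common way image features can constrain an inertially propagated pose: an IMU-derived pose prior is refined by minimizing the reprojection error of known 3D landmarks against their observed image features. This is a minimal illustration under assumed conditions, not the paper's exact algorithm; the camera intrinsics, landmark coordinates, and first-order rotation update are all illustrative assumptions.

```python
# Minimal sketch of optimization-based inertial/visual pose fusion
# (assumed form): refine an IMU-propagated pose so that projections of
# known 3D landmarks match their observed image features.
import numpy as np
from scipy.optimize import least_squares

fx = fy = 500.0          # assumed focal lengths (pixels)
cx, cy = 320.0, 240.0    # assumed principal point (pixels)

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def project(points_cam):
    """Pinhole projection of Nx3 points expressed in the camera frame."""
    uv = points_cam[:, :2] / points_cam[:, 2:3]
    return np.column_stack((fx * uv[:, 0] + cx, fy * uv[:, 1] + cy))

def residuals(delta, R0, t0, landmarks, features):
    """Reprojection error after applying a small correction (dtheta, dt)
    to the IMU-propagated pose prior (R0, t0)."""
    dtheta, dt = delta[:3], delta[3:]
    R = (np.eye(3) + skew(dtheta)) @ R0   # first-order rotation update
    t = t0 + dt
    return (project((R @ landmarks.T).T + t) - features).ravel()

# Toy data: a true pose, some landmarks, and their feature observations.
rng = np.random.default_rng(0)
landmarks = rng.uniform([-1, -1, 4], [1, 1, 8], size=(10, 3))
R_true, t_true = np.eye(3), np.array([0.1, -0.05, 0.0])
features = project((R_true @ landmarks.T).T + t_true)

# Inertial propagation is assumed to yield a slightly drifted prior.
R0, t0 = np.eye(3), t_true + np.array([0.05, 0.02, -0.03])

sol = least_squares(residuals, np.zeros(6),
                    args=(R0, t0, landmarks, features))
print("translation correction:", sol.x[3:])  # ~ -[0.05, 0.02, -0.03]
```

In this formulation the image features act purely as constraints on the six pose parameters; in a fuller system the same residual structure could be extended to refine inertial sensor biases as well.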
