Tightly Coupled Visual-Inertial Odometry Based on Artificial Landmarks

Motion planning and control of mobile robots rely on high-accuracy estimation of pose and velocity. Many researchers use motion capture systems to estimate the robot state, but such systems are usually expensive and can only be used indoors. For this reason, this paper proposes a cost-effective visual-inertial odometry system based on artificial landmarks, with an Iterated Extended Kalman Filter (IEKF) as the back-end optimizer. Since most tags used as artificial landmarks, such as barcodes, face the problem of decoding embedded information during detection, we redesign the tag to simplify deployment and improve detection efficiency. The proposed IEKF framework fuses measurements from a fisheye monocular camera and an IMU to jointly estimate the robot pose, the locations of the detected tag corners, and the IMU bias. Additionally, online calibration of the IMU bias and of the extrinsic parameters between the IMU and the camera can be performed during robot motion. By running our algorithm in real time on an unmanned aerial vehicle (UAV) and comparing against ground-truth data from an OptiTrack motion capture system, we show that it provides high-precision estimates of the UAV's pose and velocity.
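The abstract does not give the filter equations, but the core of an IEKF back-end is the iterated measurement update: the standard EKF update is re-linearized about the current iterate until it converges. The following is a minimal sketch of that generic update step, assuming a vector state and an additive-noise measurement model; the function names and the simple convergence test are illustrative, not the paper's implementation.

```python
import numpy as np

def iekf_update(x0, P, h, H_jac, z, R, n_iters=10, tol=1e-8):
    """Iterated EKF measurement update (Gauss-Newton on the EKF cost).

    x0    : prior state estimate (n,)
    P     : prior covariance (n, n)
    h     : measurement model, h(x) -> (m,)
    H_jac : Jacobian of h evaluated at x, H_jac(x) -> (m, n)
    z     : measurement (m,)
    R     : measurement noise covariance (m, m)
    """
    x = x0.copy()
    for _ in range(n_iters):
        H = H_jac(x)                       # re-linearize about current iterate
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain at this iterate
        # Iterated update: correct the innovation for the shift x0 -> x
        x_new = x0 + K @ (z - h(x) - H @ (x0 - x))
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    # Covariance update at the converged linearization point
    H = H_jac(x)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    P_new = (np.eye(len(x0)) - K @ H) @ P
    return x, P_new
```

For a linear measurement model the loop converges in one iteration and reduces to the ordinary EKF update; the benefit appears with nonlinear models such as the fisheye projection of tag corners, where re-linearization reduces the error introduced by a poor prior.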
