Rotational Coordinate Transformation for Visual-Inertial Sensor Fusion

Visual and inertial sensors are used together in many applications because of their complementary properties. A key problem in fusing them is the relative coordinate transformation between the sensor frames. This paper presents a quaternion-based method to estimate the relative rotation between visual and inertial sensors. The rotation between a camera and an inertial measurement unit (IMU) is represented by quaternions, which are measured separately so that each sensor can be optimized individually. Relative quaternions are used so that no global reference frame needs to be known. The accuracy of the coordinate transformation was evaluated against a ground-truth tracking system, and the experimental analysis demonstrates the accuracy and robustness of the proposed method.
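The abstract does not detail the estimator itself, so the following is only an illustrative sketch of one common quaternion-based formulation of the problem it describes: recovering the fixed camera-to-IMU rotation q from paired relative rotations, using the hand-eye-style constraint q_imu ⊗ q = q ⊗ q_cam and a linear least-squares solution via SVD. Synchronized, noise-free relative quaternions are assumed, and all function names are hypothetical, not taken from the paper.

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def left_mat(q):
    """Matrix L(q) such that q ⊗ p = L(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def right_mat(p):
    """Matrix R(p) such that q ⊗ p = R(p) @ q."""
    w, x, y, z = p
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def estimate_relative_rotation(imu_quats, cam_quats):
    """Estimate the fixed sensor-to-sensor quaternion q from paired
    relative rotations, via q_imu ⊗ q - q ⊗ q_cam = 0 for every pair.
    Stacking (L(q_imu) - R(q_cam)) q = 0 gives a homogeneous system;
    the solution is the right singular vector of the smallest singular value."""
    M = np.vstack([left_mat(qi) - right_mat(qc)
                   for qi, qc in zip(imu_quats, cam_quats)])
    _, _, vt = np.linalg.svd(M)
    q = vt[-1]
    return q / np.linalg.norm(q)
```

With exact measurements the recovered quaternion matches the true one up to sign (q and -q encode the same rotation), which is why implementations typically fix the sign before comparing.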