A Spline-Based Trajectory Representation for Sensor Fusion and Rolling Shutter Cameras

Using multiple sensors for ego-motion estimation is a common approach to achieving more accurate and robust results. However, when ego-motion is represented as a discrete series of poses, fusing information from unsynchronized sensors is not straightforward. The framework described in this paper aims to provide a unified solution for ego-motion estimation problems involving high-rate, unsynchronized devices. Instead of a discrete-time pose representation, we present a continuous-time formulation based on cumulative cubic B-splines parameterized in the Lie algebra of the group $\mathbb{SE}(3)$. This trajectory representation has several advantages for sensor fusion: (1) it has local control, which enables sliding-window implementations; (2) it is $C^2$ continuous, allowing predictions of inertial measurements; (3) it closely matches torque-minimal motions; (4) it has no singularities when representing rotations; (5) it easily handles measurements from multiple sensors arriving at different times when timestamps are available; and (6) it handles rolling shutter cameras naturally. We apply this continuous-time framework to visual-inertial simultaneous localization and mapping (SLAM) and show that it can also be used to calibrate the entire system.
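As an illustration of the representation the abstract describes, the sketch below evaluates a cumulative cubic B-spline on $\mathbb{SO}(3)$ (the rotational part of the pose spline) using the standard cumulative basis for uniform cubic B-splines. The function names are hypothetical, and the exponential/logarithm maps are borrowed from SciPy's rotation-vector conversions; a full $\mathbb{SE}(3)$ implementation would follow the same pattern with pose matrices.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def cumulative_basis(u):
    """Cumulative basis of the uniform cubic B-spline for u in [0, 1).

    B~_j(u) = sum_{l >= j} B_l(u); B~_0 is identically 1 because the
    ordinary basis functions form a partition of unity.
    """
    return np.array([
        1.0,
        (5.0 + 3.0 * u - 3.0 * u**2 + u**3) / 6.0,
        (1.0 + 3.0 * u + 3.0 * u**2 - 2.0 * u**3) / 6.0,
        u**3 / 6.0,
    ])

def spline_rotation(ctrl, u):
    """Evaluate the cumulative cubic B-spline on SO(3) at u in [0, 1),
    given four consecutive control rotations ctrl[0..3].

    R(u) = R_0 * prod_{j=1..3} exp(B~_j(u) * log(R_{j-1}^{-1} R_j)),
    where as_rotvec/from_rotvec play the roles of log/exp on SO(3).
    """
    bt = cumulative_basis(u)
    out = ctrl[0]
    for j in range(1, 4):
        omega = (ctrl[j - 1].inv() * ctrl[j]).as_rotvec()  # relative log map
        out = out * R.from_rotvec(bt[j] * omega)           # scaled exp map
    return out
```

For control rotations sharing one axis, the spline reduces to an ordinary cubic B-spline of the rotation angles, which gives a convenient sanity check; the smooth, singularity-free interpolation between the control poses is what makes this form suitable for querying the trajectory at arbitrary sensor timestamps.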
