Dense Continuous-Time Tracking and Mapping with Rolling Shutter RGB-D Cameras

We propose a dense continuous-time tracking and mapping method for RGB-D cameras. We parametrize the camera trajectory with continuous B-splines and optimize the trajectory through dense, direct image alignment. Our method also directly models rolling shutter in both the RGB and depth images within the optimization, which improves tracking and reconstruction quality for low-cost CMOS sensors. A continuous trajectory representation has several advantages over a discrete-time representation (e.g., camera poses at the frame interval). With splines, fewer variables need to be optimized than with a discrete representation, since the trajectory can be represented with fewer control points than frames. Splines also naturally impose smoothness constraints on the derivatives of the trajectory estimate. Finally, the continuous representation makes it possible to compensate for rolling shutter effects, since a pose estimate is available at every exposure time within an image. Our approach demonstrates superior tracking and reconstruction quality compared to approaches that assume discrete-time poses or a global shutter.
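To illustrate the key property the abstract relies on (a pose estimate at any exposure time), the sketch below evaluates a uniform cumulative cubic B-spline between four consecutive control points. This is a simplified, hypothetical example, not the paper's implementation: it interpolates positions in vector space only, whereas the full method would interpolate SE(3) poses, replacing the weighted sums of control-point differences with products of matrix exponentials, exp(b_j · log(T_{j-1}^{-1} T_j)).

```python
import numpy as np

def cumulative_bspline_weights(u):
    """Cumulative basis weights of a uniform cubic B-spline,
    evaluated at normalized time u in [0, 1) within a knot interval."""
    u2, u3 = u * u, u * u * u
    b1 = (u3 - 3 * u2 + 3 * u + 5) / 6.0
    b2 = (-2 * u3 + 3 * u2 + 3 * u + 1) / 6.0
    b3 = u3 / 6.0
    return b1, b2, b3

def interpolate(ctrl, u):
    """Evaluate the spline between four consecutive control points
    ctrl[0..3] (positions only; the SE(3) case is analogous)."""
    p = [np.asarray(c, dtype=float) for c in ctrl]
    b1, b2, b3 = cumulative_bspline_weights(u)
    # Start from the first control point and accumulate weighted increments.
    return p[0] + b1 * (p[1] - p[0]) + b2 * (p[2] - p[1]) + b3 * (p[3] - p[2])

# A pose (here: a 1-D position) is available at any continuous time, e.g.
# at the readout time of each scanline of a rolling shutter image.
# Equally spaced control points reproduce linear (constant-velocity) motion:
print(interpolate([0.0, 1.0, 2.0, 3.0], 0.5))  # prints 1.5
```

Because the cumulative formulation expresses the trajectory as increments relative to the previous control point, it transfers directly to Lie groups such as SE(3), which is what makes the rolling shutter compensation in the optimization tractable.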
