From the Calibration of a Light-Field Camera to Direct Plenoptic Odometry

This paper presents a complete framework that extends the calibration of a plenoptic camera to plenoptic-camera-based visual odometry. This is achieved by establishing a multiple view geometry for plenoptic cameras. Based on this novel multiple view geometry, a calibration approach is developed that optimizes all intrinsic parameters of the plenoptic camera model, the 3D coordinates of the calibration points, and all camera poses in a single bundle adjustment. Our plenoptic-camera-based visual odometry algorithm, called direct plenoptic odometry (DPO), is a direct and semi-dense approach that takes advantage of the full sensor resolution; it likewise relies on our multiple view geometry for plenoptic cameras. Tracking and mapping work directly on the micro images formed by the micro lens array and therefore do not have to deal with aliasing effects in the spatial domain. The algorithm generates a semi-dense depth map based on correspondences between subsequent light-field frames, while taking differently focused micro images into account. To the best of our knowledge, it is the first method that performs tracking and mapping for plenoptic cameras directly on the micro images. DPO outperforms state-of-the-art direct monocular simultaneous localization and mapping (SLAM) algorithms and competes in accuracy with the latest stereo SLAM approaches, while supplying much more detailed point clouds.
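
The depth estimation underlying the semi-dense mapping can be illustrated with a minimal sketch. In a focused plenoptic camera, neighboring micro images observe the same virtual image point from slightly shifted viewpoints, so depth follows from triangulation much like in a stereo pair: the micro lens pitch plays the role of the baseline, and the disparity is measured between corresponding points in two micro images. The function below is only an illustrative analogue, not the paper's implementation; the parameter names (`pitch_px`, `disparity_px`) are assumptions for this sketch.

```python
# Illustrative sketch (not the paper's implementation): virtual depth from the
# disparity between two neighboring micro images of a focused plenoptic camera.
# Triangulation analogue: v = baseline / disparity, where the baseline is the
# distance between the two micro lens centers, both measured in pixels. The
# resulting virtual depth v is dimensionless (a multiple of the distance
# between the micro lens array and the sensor).

def virtual_depth(pitch_px: float, disparity_px: float) -> float:
    """Return the virtual depth v = pitch / disparity for one correspondence."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return pitch_px / disparity_px

# Example: micro lens pitch of 35 px, measured disparity of 7 px
v = virtual_depth(35.0, 7.0)  # virtual depth of 5.0
```

Note that points close to the camera produce small disparities across neighboring micro images, so in practice correspondences over several micro lens baselines are combined to stabilize the estimate.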
