Calibration and correction of vignetting effects with an application to 3D mapping

Cheap RGB-D sensors are ubiquitous in robotics. They typically contain a consumer-grade color camera that suffers from significant optical nonlinearities, often referred to as vignetting effects. In the Asus Xtion Pro Live, for example, pixels in the corners of the image are two times darker than those in the center. This deteriorates the visual appearance of 3D maps built with such cameras. We propose a simple calibration method that requires only a sheet of white paper as a calibration target and reliably recovers the vignetting response of the camera. We present calibration results for several popular RGB-D sensors and show that removing vignetting effects with a nonparametric response model improves the color coherence of the reconstructed maps. Furthermore, we show how to effectively compensate for color variations caused by the camera's automatic white balance and exposure time control.
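The abstract does not include code, but the core of the correction can be sketched as follows. Assuming an approximately linear camera response and a per-pixel (nonparametric) vignetting response estimated from images of a uniformly lit white sheet of paper, correction amounts to dividing each frame by that response. The function names below (estimate_vignetting_response, correct_vignetting) are illustrative, not from the paper.

```python
import numpy as np

def estimate_vignetting_response(white_paper_images):
    """Estimate a per-pixel (nonparametric) vignetting response from
    images of a uniformly lit sheet of white paper.

    Illustrative sketch: average the calibration frames to suppress
    sensor noise, then normalize by the brightest pixel so the response
    is 1.0 where there is no attenuation and < 1.0 toward the corners.
    """
    stack = np.stack([img.astype(np.float64) for img in white_paper_images])
    mean_img = stack.mean(axis=0)          # average out per-frame noise
    response = mean_img / mean_img.max()   # normalize to (0, 1]
    return response

def correct_vignetting(image, response, eps=1e-6):
    """Undo vignetting by dividing the image by the estimated response."""
    corrected = image.astype(np.float64) / (response + eps)
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

In practice the response would be estimated separately for each color channel, and the white-paper frames should be captured with fixed exposure and white balance so that vignetting is the only remaining spatial variation.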
