Discrete Light Source Estimation from Light Probes for Photorealistic Rendering

Applications such as image rendering with computer graphics methods usually require sophisticated light models to give better control. Complex scenes in computer-generated imagery demand highly differentiated light models for a realistic rendering of the scene, which usually means a large number of (virtual) light sources to reproduce accurate shadows and shading. In visual effects production for film and TV in particular, the real scene lighting needs to be captured very accurately so that virtual objects can be rendered realistically into the scene. In this context, light modeling is usually done manually by skilled artists in a time-consuming process.

This contribution describes a new technique for estimating discrete spot light sources. The method uses a consumer-grade DSLR camera equipped with a fisheye lens to capture light probe images registered to the scene. From these probe images, the geometric and radiometric properties of the dominant light sources in the scene are estimated. The first step is a robust approach that identifies light sources in the light probes and finds their exact positions by triangulation. The light direction and radiometric fall-off properties are then formulated and estimated in a least-squares minimization.

Our approach has a number of advantages. First, the probing camera is registered using a multi-camera setup, which requires minimal amendments to the studio. Second, we are not limited to any specific probing object, since the properties of each light are estimated by processing the probe images. In addition, since the probing camera can move freely in the area of interest, there is no limit on the covered space; the large field of view of the fisheye lens is also beneficial in this respect.

Calibration and Registration of Cameras. We propose a two-step calibration and registration approach. In the first step, a planar asymmetric calibration pattern is used for simultaneous calibration of the intrinsics and poses of all witness cameras and the principal camera using a bundle adjustment module. In the next step, the parameters of the witness cameras are kept fixed and the probing camera is registered in the same coordinate system using the color features of an attached calibration rig.

Position Estimation. To estimate the 3D position vectors of the light sources, one needs to shoot rays from every detected light blob in all probe images and triangulate the corresponding rays from at least two probe positions for each source. Figure 1 summarizes the required steps; the sketches below illustrate the calibration, triangulation, and fall-off fitting steps in turn.
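
As an illustration of the first calibration step, the following Python sketch detects a planar asymmetric circle grid and calibrates a single witness camera with OpenCV. It is a minimal sketch under assumed parameters (grid size, spacing, pinhole lens model); the joint bundle adjustment over all witness cameras and the principal camera, and the color-feature registration of the probing camera, are not reproduced here.

# Minimal calibration sketch. Assumptions (not from the paper): a single
# witness camera, a pinhole lens model, an OpenCV asymmetric circle grid
# of 4 x 11 points with 2 cm spacing.
import cv2
import numpy as np

PATTERN_SIZE = (4, 11)   # circles per row, number of rows (assumed)
SPACING = 0.02           # grid spacing in metres (assumed)

def grid_object_points(pattern_size, spacing):
    # 3D coordinates of the asymmetric circle grid in the pattern plane (z = 0).
    cols, rows = pattern_size
    pts = [((2 * c + r % 2) * spacing, r * spacing, 0.0)
           for r in range(rows) for c in range(cols)]
    return np.array(pts, dtype=np.float32)

def calibrate_witness_camera(images):
    # Intrinsics and per-view pattern pose of one witness camera.
    template = grid_object_points(PATTERN_SIZE, SPACING)
    obj_pts, img_pts = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, centers = cv2.findCirclesGrid(
            gray, PATTERN_SIZE, flags=cv2.CALIB_CB_ASYMMETRIC_GRID)
        if found:
            obj_pts.append(template)
            img_pts.append(centers)
    h, w = images[0].shape[:2]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, (w, h), None, None)
    return K, dist, rvecs, tvecs   # rvecs/tvecs give the pattern pose per view

In a multi-camera setup, the per-view pattern poses recovered here would feed the bundle adjustment that places all witness cameras and the principal camera in one coordinate system.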
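
For the position estimation step, the sketch below triangulates one light source from rays shot through the detected light blobs. It assumes the blob centroids have already been back-projected into rays (camera center plus unit direction) in the common scene coordinate system, for example through the calibrated fisheye model; the least-squares point closest to all rays is returned, which reduces to the classic midpoint method for two rays.

# Minimal triangulation sketch. Assumption: each detected light blob has
# already been back-projected into a ray expressed in scene coordinates.
import numpy as np

def triangulate_rays(origins, directions):
    # Least-squares intersection: the point p minimising the summed squared
    # distance to all rays solves
    #   sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)      # projector orthogonal to the ray
        A += P
        b += P @ np.asarray(o, dtype=float)
    return np.linalg.solve(A, b)            # estimated 3D light position

# Example: two probe positions observing the same light near (0, 0, 3).
light = triangulate_rays(
    origins=[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
    directions=[[0.0, 0.0, 1.0], [-0.316, 0.0, 0.949]])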
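
The least-squares estimation of the light direction and radiometric fall-off could look like the following sketch. The cosine-exponent spot model used here (L = I0 * max(0, cos theta)^n) is a common parameterization assumed purely for illustration, not necessarily the paper's exact formulation; the function names and the SciPy-based solver are likewise illustrative.

# Minimal fall-off fitting sketch. Assumption: intensity samples observed at
# known 3D points around an already triangulated light position, modelled by
# a cosine-exponent spot light L = I0 * max(0, cos(theta))**n.
import numpy as np
from scipy.optimize import least_squares

def fit_spot_light(light_pos, sample_points, observed_intensity):
    sample_points = np.asarray(sample_points, dtype=float)
    observed_intensity = np.asarray(observed_intensity, dtype=float)
    dirs = sample_points - np.asarray(light_pos, dtype=float)
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

    def axis_from_angles(az, el):
        # Spot axis parameterised by azimuth/elevation to keep it a unit vector.
        return np.array([np.cos(el) * np.cos(az),
                         np.cos(el) * np.sin(az),
                         np.sin(el)])

    def residuals(params):
        az, el, i0, n = params
        cos_t = np.clip(dirs @ axis_from_angles(az, el), 0.0, 1.0)
        return i0 * cos_t ** n - observed_intensity

    x0 = [0.0, 0.0, float(np.max(observed_intensity)), 1.0]
    fit = least_squares(residuals, x0,
                        bounds=([-np.pi, -np.pi / 2, 0.0, 0.1],
                                [np.pi, np.pi / 2, np.inf, np.inf]))
    az, el, i0, n = fit.x
    return axis_from_angles(az, el), i0, n   # spot axis, intensity, exponent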
