A new approach to outdoor illumination estimation based on statistical analysis for augmented reality

Illumination consistency plays an important role in the realistic rendering of virtual characters integrated into live video of a real scene. This paper proposes a novel method for estimating the illumination conditions of outdoor videos captured from a fixed viewpoint. We first derive an analytical model that relates the statistics of an image to the lighting parameters of the scene under a basic illumination model. Exploiting this model, we then develop a framework to estimate the lighting conditions of live videos. To apply this approach to scenes containing dynamic objects, such as intrusive pedestrians and swaying trees, we enforce two constraints, namely spatial and temporal illumination coherence, to refine the solution. Our approach requires no geometric information about the scene and runs in real time. Experiments show that with the lighting parameters recovered by our method, virtual characters can be seamlessly integrated into the live video. Copyright © 2010 John Wiley & Sons, Ltd. The accompanying figure shows one frame of an augmented video in which a cowboy and a wine barrel are integrated into the scene.
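The abstract outlines two ideas that can be sketched concretely: relating per-frame image statistics linearly to the parameters of a basic illumination model (an ambient term plus a sun-like directional term), and enforcing temporal illumination coherence across frames. The sketch below is an illustration only, not the paper's actual formulation; the region shading factors, the least-squares fit, and the exponential smoothing used for temporal coherence are all assumptions made for the example.

```python
import numpy as np

def estimate_lighting(region_means, shading_factors):
    """Hypothetical sketch: fit ambient and directional coefficients.

    Assumes a basic illumination model in which the mean intensity of a
    static scene region is approximately
        mean = ambient + directional * shading_factor,
    where shading_factor stands in for the cosine shading term of that
    region. Solves for [ambient, directional] by least squares.
    """
    X = np.column_stack([np.ones_like(shading_factors), shading_factors])
    coeffs, *_ = np.linalg.lstsq(X, region_means, rcond=None)
    return coeffs  # [ambient, directional]

def temporal_smooth(prev_params, new_params, alpha=0.8):
    """Temporal illumination coherence as simple exponential smoothing:
    blend the previous frame's estimate with the current one so lighting
    parameters cannot jump abruptly (e.g. when a pedestrian intrudes)."""
    return alpha * prev_params + (1.0 - alpha) * new_params
```

In this toy setup, three regions with shading factors 0.0, 0.5, and 1.0 under ambient 0.3 and directional 0.7 yield mean intensities 0.3, 0.65, and 1.0, from which the fit recovers the two coefficients exactly; the smoothing step then damps frame-to-frame changes in those coefficients.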
