Light field transfer: global illumination between real and synthetic objects

We present a novel image-based method for compositing real and synthetic objects in the same scene with a high degree of visual realism. Ours is the first technique to allow global illumination and near-field lighting effects between real and synthetic objects at interactive rates, without requiring a geometric and material model of the real scene. We achieve this by using a light field interface between the real and synthetic components---thus, indirect illumination can be simulated using only two 4D light fields, one captured from and one projected onto the real scene. Multiple bounces of interreflection are obtained simply by iterating this exchange. The interactivity of our technique enables its use with time-varying scenes, including dynamic objects. This is in sharp contrast to the alternative approach of using 6D or 8D light transport functions of real objects, which are expensive to acquire and store and hence unsuitable for real-time applications. In our method, 4D radiance fields are simultaneously captured and projected using a lens array, a video camera, and a digital projector. The method supports full global illumination with restricted object placement, and accommodates moderately specular materials. We implement a complete system and show several example scene compositions that demonstrate global illumination effects between dynamic real and synthetic objects. Our implementation requires a single point light source and a dark background.
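The iterative light-field exchange can be sketched numerically. The code below is an illustrative simulation under stated assumptions, not the paper's implementation: the 4D light fields are flattened into vectors of ray samples, and each scene's light transport is modeled as a hypothetical linear operator. In the physical system, applying `T_real` would correspond to projecting a light field onto the real scene and capturing its response with the lens array, camera, and projector; `T_synth` would correspond to re-rendering the synthetic scene under the captured incident field.

```python
import numpy as np

# Assumption-laden sketch of the capture/project iteration, NOT the
# authors' system. Light fields are vectors of n ray samples; scene
# transport is a linear operator on those vectors.
rng = np.random.default_rng(0)
n = 64  # ray samples in a (heavily down-sampled) 4D light field

# Hypothetical transport operators, scaled so each bounce reflects only
# a fraction of the incident energy (energy-conserving reflection).
T_real = 0.5 * rng.random((n, n)) / n   # stands in for project-then-capture
T_synth = 0.5 * rng.random((n, n)) / n  # stands in for the simulated renderer

# Direct illumination leaving the synthetic scene toward the interface.
emission = rng.random(n)

def interreflections(bounces):
    """Accumulate the indirect light field arriving from the real scene
    by iterating the light-field exchange `bounces` times."""
    total = np.zeros(n)
    outgoing = emission.copy()  # light field sent toward the real scene
    for _ in range(bounces):
        captured = T_real @ outgoing   # project onto real scene, capture response
        total += captured              # add this bounce's contribution
        outgoing = T_synth @ captured  # synthetic scene reflects it back
    return total
```

Because each simulated bounce reflects only a fraction of the incident energy, per-bounce contributions shrink geometrically and the accumulated indirect illumination converges after a few iterations, which mirrors why a small number of capture/project cycles suffices for multiple bounces of interreflection.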
