Interactive Near-Field Illumination for Photorealistic Augmented Reality with Varying Materials on Mobile Devices

Photorealistic augmentation is currently out of reach on mobile devices because their computational power is insufficient, and even streaming solutions from stationary PCs introduce latency that noticeably degrades user interaction. We therefore introduce a differential rendering method that provides consistent illumination of inserted virtual objects on mobile devices without such delays. The computational effort is shared between a stationary PC and the mobile devices to exploit the capacities available on both sides, and the method is designed so that only a minimal amount of data has to be transferred asynchronously between the participants. This enables interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination. To describe the complex near-field illumination of an indoor scene, HDR video cameras capture the incident light from multiple directions, so that light sources can be taken into account that are not directly visible to the mobile device because of occlusions and its limited field of view. While our method focuses on Lambertian materials, we also present initial approaches to approximating non-diffuse virtual objects, widening the field of application at nearly the same cost.
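
The core of the approach is differential rendering: the real camera image is corrected by the difference between two renderings of the modeled local scene, one with and one without the virtual objects, so that shadows and indirect light caused by the virtual objects are transferred onto the real background. The following NumPy sketch illustrates only this generic compositing step; the function and parameter names (differential_composite, camera_image, rendered_with, rendered_without, object_mask) are illustrative and not taken from the paper, which additionally distributes the computation between a stationary PC and the mobile device.

    import numpy as np

    def differential_composite(camera_image, rendered_with, rendered_without, object_mask):
        # camera_image     : HDR frame from the real camera, float array of shape (H, W, 3)
        # rendered_with    : rendering of the modeled local scene including the virtual objects
        # rendered_without : rendering of the modeled local scene without the virtual objects
        # object_mask      : boolean array (H, W), True where a virtual object covers the pixel
        #
        # The delta encodes how the virtual objects change the illumination of the
        # surrounding real geometry (shadows, color bleeding).
        delta = rendered_with - rendered_without
        composite = camera_image + delta
        # Pixels covered by a virtual object show the rendered object directly.
        composite[object_mask] = rendered_with[object_mask]
        return np.clip(composite, 0.0, None)
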
