Minimal Warping: Planning Incremental Novel‐view Synthesis

Observing that many visual effects (depth of field, motion blur, soft shadows, spectral effects) and several sampling modalities (time, stereo, or light fields) can be expressed as a sum of many pinhole-camera images, we propose a novel and efficient image-synthesis framework that exploits the coherence among those images. We introduce the notion of “distribution flow”, which represents the 2D image deformation in response to changes in the high-dimensional time, lens, area-light, spectral, etc. coordinates. Our approach plans the optimal traversal of the distribution space of all required pinhole images such that, starting from one representative root image that is incrementally changed (warped) in a minimal fashion, pixels move by at most one pixel, if at all. This incremental warping allows extremely simple warping code, typically requiring half a millisecond per pinhole image on an Nvidia GeForce GTX 980 Ti GPU. We show that the bounded sampling introduces very little error compared to re-rendering or a common warping-based solution. Our approach enables efficient previews of arbitrary combinations of distribution effects and imaging modalities with little noise and high visual fidelity.
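
As a concrete illustration of the bounded, incremental warping described above, the sketch below applies a single warp step in which every pixel moves by at most one pixel along a per-pixel 2D flow field. This is a minimal, hypothetical C++ interpretation of that one-pixel constraint only; the names (`incrementalWarpStep`, `Image`), the scatter-style warp, and the simplified flow bookkeeping are assumptions for illustration and do not reproduce the paper's GPU implementation or its planning step.

```cpp
// Hypothetical sketch: one incremental warp step in which each pixel moves by
// at most one pixel along its remaining "distribution flow" vector.
#include <algorithm>
#include <cmath>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<float> rgb;                       // width * height * 3, row-major
    float& at(int x, int y, int c)       { return rgb[(y * width + x) * 3 + c]; }
    float  at(int x, int y, int c) const { return rgb[(y * width + x) * 3 + c]; }
};

// flowX/flowY hold the remaining per-pixel displacement (in pixels) toward the
// next pinhole image; bookkeeping stays indexed by the source pixel for brevity.
void incrementalWarpStep(const Image& src, Image& dst,
                         std::vector<float>& flowX, std::vector<float>& flowY)
{
    dst = src;                                    // vacated pixels keep their old value (crude hole fill)
    for (int y = 0; y < src.height; ++y) {
        for (int x = 0; x < src.width; ++x) {
            const int i = y * src.width + x;
            // Clamp the step to one pixel per axis; the planner is assumed to
            // have split the full displacement into such unit steps.
            const float sx = std::clamp(flowX[i], -1.0f, 1.0f);
            const float sy = std::clamp(flowY[i], -1.0f, 1.0f);
            const int nx = std::clamp(x + static_cast<int>(std::lround(sx)), 0, src.width  - 1);
            const int ny = std::clamp(y + static_cast<int>(std::lround(sy)), 0, src.height - 1);
            for (int c = 0; c < 3; ++c)
                dst.at(nx, ny, c) = src.at(x, y, c);
            flowX[i] -= sx;                       // leftover flow for later steps
            flowY[i] -= sy;
        }
    }
}
```

Repeating such unit-bounded steps along the planned traversal of the distribution space would then produce each required pinhole image from its predecessor at very low per-image cost.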
