Space-time visual effects as a post-production process

Space-time visual effects play an increasingly prominent role in recent motion picture productions as well as TV commercials. Currently, these effects must be meticulously planned before extensive, specialized camera equipment can be precisely positioned and aligned on set; once recorded, the effect can no longer be altered or edited. In this paper, we present an alternative approach to space-time visual effects creation that allows flexible generation and interactive editing of a multitude of different effects during post-production. The approach requires neither expensive, special recording equipment nor elaborate on-set alignment or calibration procedures. Instead, a handful of off-the-shelf camcorders positioned around a real-world scene suffices to record the input data. We synthesize various space-time visual effects from unsynchronized, sparse multi-view video footage by making use of recent advances in image interpolation. Based on a representation in a distinct navigation space, our space-time visual effects (STF/X) editor allows us to interactively create and edit various effects on the fly, such as slow motion, stop motion, freeze-rotate, motion blur, multi-exposure, flash trail and motion distortion. Since the input to our approach consists solely of video frames, various image-based artistic stylizations, such as speed lines and particle effects, are also integrated into the editor. Finally, different effects can be combined, enabling the creation of new visual effects that are impossible to record with the conventional on-set approach.
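Conceptually, effect synthesis of this kind can be thought of as sampling a path through a two-dimensional navigation space spanned by virtual viewpoint and scene time, synthesizing each sample by image interpolation from the sparse multi-view input, and blending the samples into the output frame. The sketch below illustrates that idea only; the names `interpolate_frame` and `render_path`, the (viewpoint, time) path parameterization, and the plain cross-fade standing in for correspondence-based image interpolation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def interpolate_frame(videos, fps, v, t):
    """Crude stand-in for image interpolation: bilinear cross-fade between the
    two nearest cameras and the two nearest frames.  `videos` is a float array
    of shape (num_cameras, num_frames, H, W, 3); a real system would use dense
    correspondences instead of a plain cross-fade."""
    num_cams, num_frames = videos.shape[:2]
    v = np.clip(v, 0, num_cams - 1)          # fractional camera index
    f = np.clip(t * fps, 0, num_frames - 1)  # fractional frame index
    v0, f0 = int(np.floor(v)), int(np.floor(f))
    v1, f1 = min(v0 + 1, num_cams - 1), min(f0 + 1, num_frames - 1)
    a, b = v - v0, f - f0
    return ((1 - a) * (1 - b) * videos[v0, f0] + (1 - a) * b * videos[v0, f1]
            + a * (1 - b) * videos[v1, f0] + a * b * videos[v1, f1])

def render_path(videos, fps, path, weights=None):
    """Blend interpolated frames along a navigation-space path of (v, t) samples."""
    if weights is None:
        weights = np.full(len(path), 1.0 / len(path))
    out = np.zeros(videos.shape[2:], dtype=np.float64)
    for (v, t), w in zip(path, weights):
        out += w * interpolate_frame(videos, fps, v, t)
    return out

# Example navigation-space paths (assumed parameterization):
freeze_rotate = [(v, 2.0) for v in np.linspace(0.0, 4.0, 60)]   # sweep viewpoint, frozen time
motion_blur   = [(2.0, t) for t in np.linspace(1.9, 2.1, 16)]   # integrate a short time window
flash_trail   = [(2.0, 2.0 - 0.1 * k) for k in range(6)]        # sparse, decaying ghosts
trail_weights = np.array([0.5 ** k for k in range(6)])
trail_weights /= trail_weights.sum()
```

Under these assumptions, uniform weights over a short time window give a motion-blur or multi-exposure look, decaying weights on sparse time samples approximate a flash trail, and sweeping the viewpoint at a fixed scene time corresponds to a freeze-rotate effect.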
