4D view synthesis: navigating through time and space

In this sketch, we present a 4D view synthesis technique for rendering large-scale 3D structures evolving in time, given a sparse sample of historical images. We built a system to visualize urban structure as a function of the selected time, thereby allowing virtual navigation in both space and time. While there is a rich literature on image-based rendering of static 3D environments, e.g., the Facade system [Debevec et al. 1996] and Photo Tourism [Snavely et al. 2006], little has been done to address the temporal aspect (e.g., occlusion due to temporal change). We construct time-dependent geometry to handle the sparse sampling. To render, we use time-and-view-dependent texture mapping and reason about visibility both in time and space. Figure 1 shows the result of view synthesis based on time-dependent geometry.
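The core rendering idea, blending input photographs with weights that depend on both temporal proximity and viewing direction, can be illustrated with a minimal sketch. This is not the sketch's actual formulation; the Gaussian temporal falloff, the cosine view falloff, and the `sigma_t` parameter are all illustrative assumptions.

```python
import math

def blend_weights(query_time, query_dir, photos, sigma_t=5.0):
    """Hypothetical time-and-view-dependent blending weights.

    Each photo is a tuple (time, view_dir, color). A photo's weight
    falls off with temporal distance from the query time (Gaussian,
    width sigma_t) and with angular deviation between its viewing
    direction and the query direction (cosine falloff).
    """
    weights = []
    for t, d, _ in photos:
        w_time = math.exp(-((query_time - t) ** 2) / (2 * sigma_t ** 2))
        dot = sum(a * b for a, b in zip(query_dir, d))
        w_view = max(dot, 0.0)  # photos facing away contribute nothing
        weights.append(w_time * w_view)
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def blend_color(query_time, query_dir, photos):
    """Weighted average of photo colors for one surface point."""
    ws = blend_weights(query_time, query_dir, photos)
    return tuple(sum(w * color[i] for w, (_, _, color) in zip(ws, photos))
                 for i in range(3))
```

For example, querying at the capture time of one photo drives its weight toward 1, so the rendered point reproduces that photo's appearance; sliding the query time toward another photo's date smoothly shifts the blend.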

[1] Frank Dellaert et al. Inferring Temporal Order of Images From 3D Structure. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2007.

[2] Jitendra Malik et al. Modeling and Rendering Architecture from Photographs: A Hybrid Geometry- and Image-Based Approach. SIGGRAPH, 1996.

[3] Steven M. Seitz et al. Photo Tourism: Exploring Photo Collections in 3D. ACM Transactions on Graphics, 2006.