Parameterized variety for multi-view multi-exposure image synthesis and high dynamic range stereo reconstruction

Multi-view stereo, novel view synthesis, and high dynamic range (HDR) imaging are three areas central to high-quality 3D view generation. This paper presents a novel parameterized-variety-based model that integrates these domains into a common framework, with the goal of accommodating multi-view stereo for input views captured at multiple exposures and rendering photo-realistic HDR images from arbitrary virtual viewpoints for high-quality 3D reconstruction. We extend the parameterized variety approach to rendering introduced by Genc and Ponce [1] to handle full perspective cameras. An efficient algebraic framework is proposed to construct an explicit parameterization of the space of all multi-view, multi-exposure images. This characterization of differently exposed views allows artifact-free HDR images and reliable depth maps to be recovered simultaneously from arbitrary camera viewpoints. A high-quality, HDR-textured 3D model of the scene is then obtained from these images and the recovered geometry.
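As background for the multi-exposure fusion step that the framework builds on (cf. Debevec and Malik [2]), the sketch below illustrates merging an aligned exposure stack into a single HDR radiance map. The function name, the hat weighting, and the linear toy response are illustrative assumptions, not the pipeline actually used in the paper.

```python
import numpy as np

def merge_exposures_to_hdr(images, exposure_times, inv_response):
    """Merge differently exposed, pre-aligned LDR images into an HDR radiance map.

    images         : list of HxW float arrays with pixel values in [0, 1]
    exposure_times : list of exposure times (seconds), one per image
    inv_response   : callable mapping pixel value -> log exposure ln(E * t);
                     for a linear sensor this is simply np.log.
    """
    log_radiance_sum = np.zeros_like(images[0])
    weight_sum = np.zeros_like(images[0])

    for img, t in zip(images, exposure_times):
        # Hat weighting favours mid-range pixels and down-weights
        # under- and over-exposed samples (in the spirit of [2]).
        w = 1.0 - np.abs(2.0 * img - 1.0)
        # Accumulate weighted log radiance: ln E = g(Z) - ln t.
        log_radiance_sum += w * (inv_response(img) - np.log(t))
        weight_sum += w

    # Avoid division by zero where every sample is saturated.
    weight_sum = np.maximum(weight_sum, 1e-6)
    return np.exp(log_radiance_sum / weight_sum)


if __name__ == "__main__":
    # Toy usage with a synthetic linear sensor clipped at 1.0.
    rng = np.random.default_rng(0)
    radiance = rng.uniform(0.05, 5.0, size=(4, 4))
    times = [1 / 30, 1 / 8, 1 / 2]
    ldr_stack = [np.clip(radiance * t, 1e-4, 1.0) for t in times]
    hdr = merge_exposures_to_hdr(ldr_stack, times, np.log)
    print(hdr)
```

In practice the inverse response g would be estimated per colour channel, for example via the calibration of [2] or constrained to the space of camera response functions studied in [3], rather than assumed linear as in this toy example.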

[1] Y. Genc and J. Ponce, "Image-Based Rendering Using Parameterized Image Varieties," International Journal of Computer Vision, 2004.

[2] P. E. Debevec and J. Malik, "Recovering High Dynamic Range Radiance Maps from Photographs," SIGGRAPH, 1997.

[3] M. D. Grossberg and S. K. Nayar, "What Is the Space of Camera Response Functions?," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2003.

[4] P. F. Stiller et al., "Solving the Recognition Problem for Six Lines Using the Dixon Resultant," 1999.

[5] R. Koch et al., "Metric 3D Surface Reconstruction from Uncalibrated Image Sequences," SMILE Workshop, 1998.