View synthesis by the parallel use of GPU and CPU

We present an algorithm for efficient depth calculation and view synthesis. The main goal is the online generation of realistic interpolated views of a dynamic scene. The inputs are video streams originating from two or more calibrated, static cameras. Efficiency is accomplished by the parallel use of the CPU and the GPU in a multi-threaded implementation. The input images are projected onto a plane sweeping through 3D space, using the hardware-accelerated transformations available on the GPU. A correlation measure is calculated simultaneously for all pixels on the plane and is compared across the different plane positions. This yields a noisy 'virtual' view and a crude depth map in very limited time. We then apply a min-cut/max-flow algorithm on a graph, implemented on the CPU, to refine this result through global optimisation.
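The core plane-sweep idea can be illustrated with a minimal CPU-side sketch. This is not the paper's implementation: it assumes just two rectified views, so that projecting onto a fronto-parallel plane reduces to a horizontal shift (a disparity), uses a per-pixel squared-difference correlation, and picks the best plane per pixel winner-take-all. The GPU projective texturing, multi-view correlation, and graph-cut refinement described above are omitted, and all names are illustrative.

```python
import numpy as np

def plane_sweep_disparity(left, right, max_disp):
    """Winner-take-all depth from a plane sweep, simplified to the
    two-view rectified case where each candidate plane corresponds
    to a horizontal shift (disparity) of the right image."""
    h, w = left.shape
    # Cost volume: one slice per candidate plane / disparity.
    costs = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # "Project" the right image onto the plane at disparity d:
        # left pixel x corresponds to right pixel x - d.
        diff = (left[:, d:] - right[:, : w - d]) ** 2
        costs[d, :, d:] = diff
    # For each pixel, keep the plane with the best correlation score.
    return np.argmin(costs, axis=0)
```

In the full algorithm the per-plane projection and correlation run on the GPU for all pixels at once, and this noisy winner-take-all map is only the starting point for the min-cut/max-flow optimisation on the CPU.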
