Real-time stereo rendering for augmented reality on 3DTV systems

Recent advances in computer vision have driven great progress in the 3DTV industry. The arrival of 3DTV in the home raises a number of challenges beyond the basic display of an available stereo pair; one of them is the production of 3D augmented content. To present virtual objects over the real world, the existing approach is offline processing in postproduction, which is difficult and expensive because specialized equipment is required and postproduction costs are high. In this paper, low-cost approaches for producing 3D augmented content in real time are proposed. To make such an application run in real time, we use OpenGL to render the stereo pair of the 3D augmented content; a specific rendering order is needed to avoid flicker in the composited image. As the camera moves, the images of the virtual objects must be generated and composited perspectively correctly with the real images. We use marker tracking for real-time camera tracking. The experimental results show that our method is suitable for rendering stereo pairs of 3D augmented content in real time: running on an NVIDIA Quadro FX 4800 graphics card, the proposed method renders both views of a 640×360 stereo pair in 16 ms.
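The abstract does not spell out how the stereo pair is set up, so the following is only a sketch of one standard way to do it: the "off-axis" (parallel-axis asymmetric-frustum) method, which produces perspectively correct left/right views and places zero-disparity objects at a chosen convergence depth. The function and parameter names (`eye_sep`, `convergence`, etc.) are illustrative, not taken from the paper; in an OpenGL quad-buffered pipeline each matrix would be loaded before drawing into `GL_BACK_LEFT` and `GL_BACK_RIGHT` respectively, swapping buffers only once per frame to avoid flicker.

```python
import numpy as np

def frustum(l, r, b, t, n, f):
    """Perspective matrix with the same layout as OpenGL's glFrustum."""
    return np.array([
        [2 * n / (r - l), 0.0,             (r + l) / (r - l),  0.0],
        [0.0,             2 * n / (t - b), (t + b) / (t - b),  0.0],
        [0.0,             0.0,            -(f + n) / (f - n), -2 * f * n / (f - n)],
        [0.0,             0.0,            -1.0,                0.0],
    ])

def stereo_frustums(fov_y_deg, aspect, near, far, eye_sep, convergence):
    """Asymmetric ('off-axis') frusta for the left and right eyes.

    Both frusta share the same image plane; their horizontal bounds are
    shifted so that points at the convergence distance project to the
    same screen position (zero disparity) in both views.
    """
    top = near * np.tan(np.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect
    # Horizontal frustum shift for each eye, scaled down to the near plane.
    shift = (eye_sep / 2.0) * near / convergence
    left_eye  = frustum(-half_w + shift,  half_w + shift, bottom, top, near, far)
    right_eye = frustum(-half_w - shift,  half_w - shift, bottom, top, near, far)
    return left_eye, right_eye
```

Each eye's view matrix is then translated by ±`eye_sep`/2 along the camera's x-axis; only the (0, 2) skew entry differs between the two projection matrices, which is what makes the frusta asymmetric.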
