TransCAIP: A Live 3D TV System Using a Camera Array and an Integral Photography Display with Interactive Control of Viewing Parameters

The system described in this paper provides a real-time 3D visual experience using an array of 64 video cameras and an integral photography display with 60 viewing directions. The live 3D scene in front of the camera array is reproduced on the full-color, full-parallax autostereoscopic display, with interactive control of viewing parameters. The main technical challenge is fast, flexible conversion of the 64 multicamera images into the integral photography format. Based on image-based rendering techniques, our conversion method first renders 60 novel images corresponding to the viewing directions of the display and then arranges the rendered pixels to produce an integral photography image. To achieve real-time processing on a single PC, all conversion steps are implemented on a GPU using GPGPU techniques. The conversion method also lets a user interactively control the viewing parameters of the displayed image, reproducing the dynamic 3D scene with the desired parameters. This control is a purely software process, requiring no hardware reconfiguration: it is achieved by changing rendering parameters such as the convergence point of the rendering cameras and the interval between their viewpoints.
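The second conversion step, arranging rendered pixels into the integral photography format, can be sketched in a few lines. The idea is that the elemental image behind each lenslet collects exactly one pixel from every rendered view, so that each viewing direction of the display sees the corresponding novel image. The sketch below is a simplified CPU version of that rearrangement; the rectangular `lens_w × lens_h` elemental-image layout and the function name are illustrative assumptions (the actual display uses an optimized color-filter layout, and the paper's implementation runs on the GPU):

```python
import numpy as np

def views_to_integral_photography(views, lens_w, lens_h):
    """Arrange multi-view images into an integral photography (IP) image.

    views:  array of shape (n_views, H, W, 3), one rendered image per
            viewing direction of the display (60 in the paper's setup).
    lens_w, lens_h: elemental-image size in pixels per lenslet, with
            lens_w * lens_h == n_views (a simplifying assumption).

    Returns an IP image of shape (H * lens_h, W * lens_w, 3): behind the
    lenslet at grid position (x, y), the elemental image holds pixel
    (x, y) from every view.
    """
    n_views, H, W, C = views.shape
    assert n_views == lens_w * lens_h, "view count must fill each lenslet"
    ip = np.empty((H * lens_h, W * lens_w, C), dtype=views.dtype)
    for d in range(n_views):
        # Map view index d to its offset (dx, dy) inside each elemental image.
        dy, dx = divmod(d, lens_w)
        # Strided assignment scatters view d across all elemental images:
        # pixel (x, y) of view d lands at offset (dx, dy) behind lenslet (x, y).
        ip[dy::lens_h, dx::lens_w, :] = views[d]
    return ip
```

Because each view maps to one strided slice of the output, the same scatter pattern translates directly into a per-pixel GPU shader or kernel, which is what makes the conversion fast enough for live video on a single PC.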