A System for Distributed Multi-camera Capture and Processing

This contribution describes a distributed multi-camera capture and processing system for real-time media production applications. Its main design purpose is to enable prototyping of distributed processing algorithms for free-viewpoint applications, but the concept can be adapted to other multi-camera applications. The system integrates broadcast components into a distributed IT-based processing system; the problems addressed in this contribution are the synchronisation of these sub-systems and the data and control flow between them. For the synchronisation of the broadcast and IT components we developed a time-stamp mechanism based on tightly synchronised PC clocks, which serve as the master clock of the system. The video frames of the gen-locked multi-camera streams are implicitly synchronised by relating their relative timing to this system clock. Two protocols for synchronising the PC clocks, NTP and PTP, were evaluated. The second main contribution is the processing of the multi-camera data in the distributed system. For this purpose a software framework was developed, based on a distributed concurrency system implemented in Python.
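
To make the timestamping idea concrete, the following Python sketch (illustrative only, not the authors' implementation) tags each incoming gen-locked frame with the synchronised PC clock and quantises that time to a frame index; the frame rate, epoch handling and class names are assumptions.

```python
import time
from dataclasses import dataclass

# Illustrative assumptions: a 25 Hz gen-locked camera system and PC clocks
# kept in step with the system master clock via NTP or PTP.
FRAME_RATE_HZ = 25.0
FRAME_PERIOD_S = 1.0 / FRAME_RATE_HZ


@dataclass
class TimestampedFrame:
    """A captured video frame tagged with the synchronised system time."""
    camera_id: int
    frame_index: int      # frame number implied by the synchronised clock
    capture_time: float   # seconds since the epoch on the synced PC clock
    payload: bytes        # raw image data (placeholder)


def timestamp_frame(camera_id: int, payload: bytes, epoch: float) -> TimestampedFrame:
    """Tag an incoming gen-locked frame against the synchronised PC clock.

    Because the cameras are gen-locked, frames grabbed on different PCs
    within the same frame period map to the same index when the
    synchronised clock is quantised to the frame period.
    """
    now = time.time()  # assumed to be disciplined by NTP/PTP
    frame_index = round((now - epoch) / FRAME_PERIOD_S)
    return TimestampedFrame(camera_id, frame_index, now, payload)
```

Provided the capture PCs keep their clocks within a fraction of a frame period of each other (via NTP or, more tightly, PTP), frames carrying the same index can be treated as simultaneous across the whole distributed system.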
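
The distributed concurrency framework itself is not detailed here; as a minimal sketch of how such a system could group and hand off the timestamped frames, the example below uses Python's standard multiprocessing module with hypothetical queue and worker names.

```python
import multiprocessing as mp
from collections import defaultdict


def match_frames(frame_queue, matched_queue, num_cameras):
    """Group frames sharing a frame index until one frame per camera has arrived."""
    pending = defaultdict(dict)  # frame_index -> {camera_id: frame}
    while True:
        frame = frame_queue.get()
        if frame is None:        # sentinel: shut the worker down
            break
        pending[frame.frame_index][frame.camera_id] = frame
        if len(pending[frame.frame_index]) == num_cameras:
            matched_queue.put(pending.pop(frame.frame_index))


if __name__ == "__main__":
    frames = mp.Queue()
    matched = mp.Queue()
    worker = mp.Process(target=match_frames, args=(frames, matched, 4))
    worker.start()
    # Capture processes on each PC would put TimestampedFrame objects
    # (see the previous sketch) onto `frames`; here we only shut down.
    frames.put(None)
    worker.join()
```

A deployed framework would need a network transport between machines rather than local queues, but the conceptual core, grouping frames by their shared frame index before further processing, stays the same.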
