Augmenting film and video footage with sensor data

With the advent of tiny networked devices, Mark Weiser's vision of a world embedded with invisible computers is coming of age. Owing to their small size and relative ease of deployment, sensor networks have been used by zoologists, seismologists and military personnel. In this paper, we investigate the application of sensor networks to the film industry; in particular, we are interested in augmenting film and video footage with sensor data. Unobtrusive sensors are deployed on a film set or in a television studio and on performers. During the filming of a scene, sensor data such as light intensity, color temperature and location are collected and synchronized with each film or video frame. Later, editors, graphics artists and programmers can view this data in synchronization with film and video playback. Such data can, for example, support a new level of seamless integration between computer graphics and real-world photography. A real-time version of our system would allow sensor data to trigger camera movement and cue special effects. In this paper, we discuss the design and implementation of the first part of our embedded film set environment: the augmented recording system. Augmented recording is a foundational component of the UCLA Hypermedia Studio's research into the use of sensor networks in film and video production. We have also evaluated our system in a television studio.
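To make the per-frame synchronization concrete, the following is a minimal sketch of how timestamped sensor readings might be mapped to individual video frames for later playback. It assumes a shared clock between camera and sensors (the network time synchronization the system relies on is abstracted away here), and all names, fields and parameters are hypothetical illustrations, not the paper's actual implementation.

    # Hypothetical sketch: look up the sensor reading associated with a video
    # frame, assuming sensors and camera share a common clock.
    from bisect import bisect_right
    from dataclasses import dataclass

    @dataclass
    class SensorSample:
        timestamp: float      # seconds on the shared studio clock
        light_lux: float      # light intensity
        color_temp_k: float   # color temperature in kelvin

    def frame_to_time(frame_index: int, fps: float, start_time: float) -> float:
        """Convert a frame index to its capture time on the shared clock."""
        return start_time + frame_index / fps

    def sample_for_frame(samples: list[SensorSample], frame_index: int,
                         fps: float, start_time: float) -> SensorSample:
        """Return the most recent sample at or before the frame's capture time.

        Assumes `samples` is sorted by timestamp, as it would be when
        logged live during a take.
        """
        t = frame_to_time(frame_index, fps, start_time)
        i = bisect_right([s.timestamp for s in samples], t)
        if i == 0:
            raise ValueError("no sensor data recorded before the first frame")
        return samples[i - 1]

    # Example: lighting conditions recorded for frame 240 of a 24 fps take
    # that began at t = 100.0 s on the studio clock (sampled every 0.5 s).
    log = [SensorSample(100.0 + k * 0.5, 800.0 - k, 5600.0) for k in range(60)]
    print(sample_for_frame(log, 240, fps=24.0, start_time=100.0))

An editing or graphics tool built on such a mapping could scrub through footage and display, for each frame, the lighting and location data captured at that instant, which is the viewing-in-synchronization workflow described above.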
