Image compositing system capable of long-range camera movement

We have been studying virtual studios, which composite a camera image filmed in a real studio with a real-time CG image driven by camera data obtained from sensors attached to the real camera. In a conventional virtual studio, however, the camera work available in the composite image is limited by the physical studio space when an actor is filmed in front of a chroma-key blue screen. We have designed and developed a new image compositing system that can operate the camera over a range far wider than is physically possible by any conventional means. This is achieved by combining a real-time CG image with a real-time, image-processed version of the foreground picture filmed by a motion-control camera. Within its range of physical motion, the system films the object by moving the motion-control camera; outside that range, it controls an image processor according to the principle of virtual shooting. In this way, the system realizes real-time camera work over an extremely wide range within a physically limited studio space. In this paper, we formulate the virtual shooting method mathematically and discuss its limitations as a substitute for real shooting. We then describe a system based on the results of this study and introduce examples of its application to TV program production.
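The split between physical and virtual shooting described above can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: it assumes a pinhole camera dollying along its optical axis toward a subject, clamps the requested camera position to the motion-control camera's physical limit, and converts the remaining (virtual) movement into a digital zoom factor for the image processor, using the fact that under a pinhole model the image scale of a subject is inversely proportional to its distance from the camera. The function name `split_camera_move` and all parameters are hypothetical.

```python
def split_camera_move(requested_z, physical_limit_z, subject_z):
    """Split a requested dolly move into a physical camera move and a
    digital zoom (virtual shooting) component.

    requested_z      -- desired camera position along the optical axis
    physical_limit_z -- farthest position the motion-control camera can reach
    subject_z        -- position of the subject along the same axis

    Returns (physical_z, zoom): the position the real camera actually
    moves to, and the zoom factor the image processor must apply to
    simulate the remainder of the move.
    """
    # The real camera covers as much of the move as its range allows.
    physical_z = min(requested_z, physical_limit_z)

    # Pinhole model: image scale is proportional to 1 / (distance to subject).
    # The zoom factor is the ratio of the desired scale to the scale
    # actually obtained at the clamped physical position.
    zoom = (subject_z - physical_z) / (subject_z - requested_z)
    return physical_z, zoom


# Move within physical range: no digital zoom needed.
print(split_camera_move(2.0, 5.0, 10.0))   # physical move only, zoom 1.0

# Move beyond physical range: camera stops at its limit and the
# image processor supplies the rest of the apparent movement.
print(split_camera_move(8.0, 5.0, 10.0))   # clamped to 5.0, zoom > 1
```

This also hints at the limitation the paper discusses: a digital zoom only rescales the existing picture, so it cannot reproduce the parallax and perspective changes a real camera move would produce.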