Telepresence Using Multiple Omni-directional Videos

The advent of high-speed networks and high-performance PCs has prompted research into networked telepresence, which allows a user to see a virtualized real scene at a remote place. View-dependent representation, which presents views corresponding to the user's viewpoint through an HMD or an immersive display, is especially effective in creating a rich sense of telepresence. The goal of this study is to realize novel view telepresence, which enables a user to control both the viewpoint and the view direction, by virtualizing real dynamic environments.

We describe a method of generating novel views from multiple omni-directional images captured at different positions using image-based rendering techniques, and we evaluate the quality of the generated images in a simulated environment. We also describe our prototype system and an experiment in which the system provided novel view telepresence in a real environment.

The prototype constructs a virtualized environment from live omni-directional videos. It synthesizes a view according to the user's viewpoint and view direction, as measured by a magnetic sensor attached to an HMD, and presents the generated view on the HMD. The system generates the user's view in real time because corresponding points among the input images are specified and camera parameters are estimated in advance.
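To make the view generation step concrete, the following is a minimal sketch, not the paper's actual algorithm, of sampling a perspective view from a single omni-directional image. It assumes the panorama is stored in equirectangular format; the function name, angle conventions, and resolution parameters are illustrative.

    import numpy as np

    def perspective_from_equirect(pano, yaw, pitch, fov_deg=60.0, out_w=640, out_h=480):
        # Sample a pinhole-camera view from an equirectangular panorama.
        # pano: H x W x 3 array; yaw and pitch (radians) give the view direction.
        H, W = pano.shape[:2]
        f = 0.5 * out_w / np.tan(0.5 * np.radians(fov_deg))  # focal length in pixels

        # Ray direction of every output pixel in camera coordinates (y points down,
        # an assumed row convention for the panorama).
        xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                             np.arange(out_h) - out_h / 2.0)
        dirs = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
        dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

        # Rotate the rays by the user's view direction: pitch about x, then yaw about y.
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        dirs = dirs @ (Ry @ Rx).T

        # Map ray directions to panorama pixels via longitude and latitude.
        lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # [-pi, pi]
        lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
        u = ((lon / (2 * np.pi) + 0.5) * W).astype(int) % W
        v = np.clip(((lat / np.pi + 0.5) * H).astype(int), 0, H - 1)
        return pano[v, u]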
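The method itself combines multiple omni-directional images captured at different positions, using the corresponding points specified in advance. As a simplified stand-in for that correspondence-based interpolation, the sketch below merely blends the views from the two capture positions nearest the user's viewpoint, weighted by inverse distance; all names here are again illustrative.

    def novel_view(panos, cam_positions, user_pos, yaw, pitch):
        # Blend perspective views from the two capture positions nearest the user,
        # weighted by inverse distance (an illustrative stand-in for the paper's
        # correspondence-based interpolation).
        d = np.linalg.norm(np.asarray(cam_positions, dtype=float) - user_pos, axis=1)
        near = np.argsort(d)[:2]                 # indices of the two closest cameras
        w = 1.0 / (d[near] + 1e-6)
        w /= w.sum()
        views = [perspective_from_equirect(panos[i], yaw, pitch).astype(float)
                 for i in near]
        return (w[0] * views[0] + w[1] * views[1]).astype(np.uint8)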
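The prototype's run loop can then be pictured as follows: poll the magnetic sensor for the user's head pose, synthesize the corresponding view, and present it on the HMD. The read_pose and show_on_hmd callbacks are hypothetical stand-ins for the sensor and display drivers, not part of the described system.

    def run_telepresence(panos, cam_positions, read_pose, show_on_hmd):
        # Per-frame loop: sensor pose -> view synthesis -> HMD display.
        # read_pose() -> (position, yaw, pitch); show_on_hmd(frame) draws one frame.
        while True:
            user_pos, yaw, pitch = read_pose()   # pose from the magnetic sensor
            frame = novel_view(panos, cam_positions, user_pos, yaw, pitch)
            show_on_hmd(frame)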