Development of a virtual reality teleconference system using distributed depth sensors

This paper presents a virtual reality (VR) system for real-time teleconferencing. We develop a distributed depth-sensor system that reconstructs 3D images of users and creates a panoramic image of the conference room in real time. As a result, users at remote locations can hold a teleconference in a virtual environment by wearing VR headsets. The contributions of this work include the development of a two-level sensor calibration and data fusion scheme, the improvement of image quality through point-to-mesh conversion, and the development of a distributed sensing and computing architecture. The developed system offers low cost, scalability for extension, and high performance in VR reconstruction, and it has been validated through a number of experiments.