3D-LIVE: D3.4: Final prototype of the 3D-LIVE platform

The 3D-LIVE platform has been co-designed with end users in an iterative way. The 3D-LIVE partners have continuously improved the components of the platform and their interoperability. This document presents the final version of the platform as experimented with end users.

The platform can be summarised as follows: indoor and outdoor users run a set of technologies that tele-immerse them in a single shared virtual environment. Both the indoor and outdoor setups comprise three main groups of components: Acquisition, User Applications and Rendering. In the acquisition group, sensors track the users' activity and transmit it to the user applications, which generate a consistent representation of each user in the game. Once this representation has been processed into the virtual scene, the applications render it on different devices depending on the setup: common devices such as computer displays and smartphones, as well as immersive devices such as CAVEs, head-mounted displays and smart goggles. The user applications, running on different platforms (smartphones or computers), communicate through a server, as in an online multiplayer game.

Finally, external data exchange has been set up to monitor two types of data through two different tools. The first tool handles weather information aggregation and queries (including the Environment Observation Service and the Environment Reconstruction Service). The second tool, called ExperiMonitor, handles experimental data, collecting and monitoring measurements from the different user applications.
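The Acquisition → User Application → Rendering flow described above can be sketched as follows. This is a minimal illustration only: the class and method names (`SensorSample`, `UserApplication`, `DisplayRenderer`, `on_sensor_data`) are hypothetical and do not correspond to the actual 3D-LIVE components or APIs.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One acquisition reading, e.g. the tracked position of a user."""
    user_id: str
    position: tuple  # (x, y, z) in the shared virtual environment

class UserApplication:
    """Turns raw sensor samples into a consistent user representation
    and forwards it to whatever rendering device the setup provides."""
    def __init__(self, renderer):
        self.renderer = renderer
        self.avatars = {}  # user_id -> latest position in the scene

    def on_sensor_data(self, sample: SensorSample):
        # Acquisition -> User Application: update the user's avatar.
        self.avatars[sample.user_id] = sample.position
        # User Application -> Rendering: push the update to the device.
        self.renderer.draw(sample.user_id, sample.position)

class DisplayRenderer:
    """Stand-in for a rendering device (display, HMD, CAVE, ...)."""
    def __init__(self):
        self.frames = []

    def draw(self, user_id, position):
        self.frames.append((user_id, position))

app = UserApplication(DisplayRenderer())
app.on_sensor_data(SensorSample("outdoor-1", (1.0, 0.0, 2.0)))
print(app.avatars["outdoor-1"])  # → (1.0, 0.0, 2.0)
```

In the real platform, the equivalent of `on_sensor_data` would also be fed by the network server, so that each user application renders the representations of the other indoor and outdoor users sharing the virtual environment.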