Ambient fields: representing potential sensory information

It is increasingly apparent that the traditional scene graph is not fulfilling the requirements of real-time interactive systems. Using a single graph to represent the current state of the world means that display systems, which may operate at very different rates or may need to predict state ahead of time, must be tightly integrated with behaviour and semantics. In this position paper, we propose a type of field called the "ambient field", which represents information proximate to the user's senses that can be sampled over short time periods. These fields might represent audio, video, haptic or other potentially sensed information. A display device can then sample these fields as necessary to construct the best representation possible at its own display rate. The ambient field draws on the concept of the ambient optical array from Gibson, light fields from computer graphics rendering, and point-based physics simulations.
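
The following is a minimal sketch of the sampling idea described above: a display device queries an ambient field at its own refresh rate, independently of how the underlying world state is updated. All names (AmbientField, SenseQuery, ToyLightField) are illustrative assumptions, not an interface defined in the paper.

```cpp
// Illustrative sketch: a display samples an "ambient field" at its own rate,
// decoupled from the behaviour/semantics layer. Names are hypothetical.
#include <array>
#include <iostream>

// A query made at a point proximate to a sense organ, over a short time window.
struct SenseQuery {
    std::array<float, 3> position;   // location near the user's eye/ear/skin
    std::array<float, 3> direction;  // sampling direction (e.g. a view ray)
    double time;                     // centre of the sampling window (seconds)
    double window;                   // width of the sampling window (seconds)
};

// Abstract ambient field: any potentially sensed quantity that can be sampled.
template <typename Sample>
class AmbientField {
public:
    virtual ~AmbientField() = default;
    virtual Sample sample(const SenseQuery& q) const = 0;
};

// Toy radiance field: a single point light evaluated analytically, so the
// display can query it at any time without waiting for a scene-graph traversal.
class ToyLightField : public AmbientField<float> {
public:
    float sample(const SenseQuery& q) const override {
        const std::array<float, 3> light{0.0f, 2.0f, 0.0f};
        const float dx = light[0] - q.position[0];
        const float dy = light[1] - q.position[1];
        const float dz = light[2] - q.position[2];
        const float dist2 = dx * dx + dy * dy + dz * dz;
        return 1.0f / (1.0f + dist2);  // simple distance-based falloff
    }
};

int main() {
    ToyLightField field;

    // The display runs at its own rate (here 90 Hz), regardless of the rate at
    // which behaviour and semantics update the field itself.
    const double display_dt = 1.0 / 90.0;
    double t = 0.0;
    for (int frame = 0; frame < 5; ++frame, t += display_dt) {
        SenseQuery q{{0.0f, 1.6f, static_cast<float>(0.1 * t)},  // head position
                     {0.0f, 0.0f, -1.0f},                        // view direction
                     t, display_dt};
        std::cout << "frame " << frame << ": radiance = " << field.sample(q) << "\n";
    }
    return 0;
}
```

In this sketch the display chooses when and where to sample; a predictive or frameless renderer could issue the same queries with extrapolated head positions or per-pixel timestamps without any change to the field interface.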

[1] James J. Gibson et al. The Ecological Approach to Visual Perception: Classic Edition, 2014.

[2] Henry Sowizral et al. Scene Graphs in the New Millennium, 2000, IEEE Computer Graphics and Applications.

[3] Suvranu De et al. Physically Realistic Virtual Surgery Using the Point-Associated Finite Field (PAFF) Approach, 2006, PRESENCE: Teleoperators and Virtual Environments.

[4] Tae-Yong Kim et al. Unified particle physics for real-time applications, 2014, ACM Trans. Graph.

[5] Georgi Gaydadjiev et al. Construction and Evaluation of an Ultra Low Latency Frameless Renderer for VR, 2016, IEEE Transactions on Visualization and Computer Graphics.

[6] Bernd Fröhlich et al. Advanced Multi-Frame Rate Rendering Techniques, 2008, IEEE Virtual Reality Conference.

[7] Marc Levoy et al. Light field rendering, 1996, SIGGRAPH.

[8] David Salesin et al. Surface light fields for 3D photography, 2000, SIGGRAPH.