Hybrid Rendering for Interactive Virtual Scenes

Interactive virtual environments used in conjunction with haptic displays are often static-viewpoint scenes that contain a mixture of static and dynamic virtual objects. The immersive realism of these environments is often limited by the graphical rendering system, typically OpenGL or Direct3D. To present more realistic scenes for haptic interaction without requiring additional modeling complexity, we have developed a technique for co-locating a prerendered, raytraced scene with objects rendered graphically and haptically in real time. We describe the depth-buffering and perspective techniques that were necessary to achieve co-location among the representations, and we demonstrate real-time haptic interaction with a scene rendered using photon mapping.
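
To illustrate the general idea of depth-based co-location (this is a minimal sketch, not the implementation described in this paper), the fragment below loads a prerendered color image as the background, writes the ray tracer's per-pixel depth into the OpenGL depth buffer, and then renders a dynamic object with the same perspective parameters so it is correctly occluded by the static scene. The GLUT window setup, the image dimensions, the `prerenderedColor`/`prerenderedDepth` arrays, the camera parameters, and the `glutSolidSphere` stand-in object are all assumptions made for the example.

```cpp
// Hypothetical sketch: co-locating a prerendered ray-traced frame with
// real-time OpenGL geometry via the depth buffer (legacy OpenGL + GLUT).
#include <GL/glut.h>
#include <vector>

const int W = 640, H = 480;
// These would be filled from the offline ray tracer's output; the depth
// values are assumed to be window-space depths in [0,1] produced with the
// same projection used below.
std::vector<unsigned char> prerenderedColor(W * H * 3);
std::vector<float>         prerenderedDepth(W * H);

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // 1. Blit the prerendered color image as the background.
    glMatrixMode(GL_PROJECTION); glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);  glLoadIdentity();
    glRasterPos2f(-1.0f, -1.0f);
    glDisable(GL_DEPTH_TEST);
    glDrawPixels(W, H, GL_RGB, GL_UNSIGNED_BYTE, prerenderedColor.data());

    // 2. Write the ray tracer's per-pixel depth into the depth buffer so
    //    static scene geometry can occlude dynamic objects. Color writes
    //    are masked off so the background image is left untouched.
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDrawPixels(W, H, GL_DEPTH_COMPONENT, GL_FLOAT, prerenderedDepth.data());
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    // 3. Render a dynamic (haptically manipulated) object using the same
    //    perspective and viewpoint as the offline render, so its depths are
    //    comparable pixel-for-pixel with the prerendered depths.
    glMatrixMode(GL_PROJECTION); glLoadIdentity();
    gluPerspective(45.0, double(W) / H, 0.1, 100.0);   // must match offline camera
    glMatrixMode(GL_MODELVIEW);  glLoadIdentity();
    gluLookAt(0, 0, 5,  0, 0, 0,  0, 1, 0);            // must match offline camera
    glutSolidSphere(0.5, 32, 32);                      // stand-in dynamic object

    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(W, H);
    glutCreateWindow("hybrid rendering sketch");
    // prerenderedColor / prerenderedDepth would be loaded from disk here.
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

The key design point in such a scheme is step 2: once the offline depth map is in the hardware depth buffer, ordinary depth testing resolves occlusion between the prerendered static scene and any real-time objects, provided both were generated from the same viewpoint and projection.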