Simulation of Blur in Stereoscopic Image Synthesis for Virtual Reality

This paper proposes a method for generating synthesized images containing blurred areas in real time, in order to aid stereoscopic perception in virtual reality systems. At any point in the process, the combined knowledge of the scenario and of the user's position in front of the screen allows the important zone of the scene to be selected automatically. The elements belonging to the sharp zone attract the user's attention, while the blurred areas help avoid excessive eyestrain. First, we present several methods for simulating the blur effects produced by a thin lens in image synthesis. Then we describe our adaptation of these methods to the stereoscopic image context, illustrated with results generated in our own virtual reality system. Finally, we propose several solutions for handling the interdependence between scenario and interactivity in image synthesis animations.
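
As background for the thin-lens blur simulation mentioned above, the following sketch shows the classical circle-of-confusion computation on which such depth-of-field methods are commonly based. It is an illustrative assumption only: the function and parameter names are not taken from the paper, and it does not describe the authors' actual implementation.

    #include <math.h>

    /*
     * Diameter of the circle of confusion under the classical thin-lens
     * model (illustrative sketch; names and units are assumptions).
     * All distances are expressed in metres; the result is the diameter
     * of the blur spot on the image plane.
     */
    static double circle_of_confusion(double aperture,      /* lens aperture diameter */
                                      double focal_length,  /* thin-lens focal length */
                                      double focus_dist,    /* distance of the plane in focus */
                                      double object_dist)   /* distance of the rendered point */
    {
        /* A point off the focus plane projects to a disc whose diameter
           grows with its distance from that plane. */
        return aperture * focal_length * fabs(object_dist - focus_dist)
               / (object_dist * (focus_dist - focal_length));
    }

In practice, a renderer would evaluate such a quantity per pixel from the depth buffer and use it to drive the amount of blur applied around the automatically selected sharp zone.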