A schematic eye for virtual environments

This paper presents a schematic eye model designed for use by virtual environments researchers and practitioners. The model, based on a combination of several ophthalmic models, closely approximates a user's optical centers and interocular separation from as little as a single measurement of pupillary distance (PD). Typically, these parameters are loosely approximated from the user's PD measured while converged to some known distance; however, this may not be sufficient for users to accurately perform spatially sensitive tasks in the near field. We investigate this possibility by comparing the impact of several common PD-based models and our schematic eye model on users' ability to accurately match real and virtual targets in depth. The comparison was performed using a specially designed display and robotic positioning apparatus that allowed sub-millimeter measurement of target positions and user responses. We found that the schematic eye model produced significantly more accurate real-to-virtual matches, with average accuracy in some cases well under 1 mm. We also present a novel, low-cost method of accurately measuring PD using an off-the-shelf trial frame and pinhole filters. We validated this method by comparing its measurements against those taken with an ophthalmic autorefractor and found no significant differences between the two methods.
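
The convergence issue noted above can be made concrete with a small geometric sketch. The function below is our illustration, not the paper's schematic eye model: it estimates the separation of the eyes' rotation centers from a near PD measured while converged on a midline target. The 10.5 mm pupil-to-rotation-center offset, the fixation distance convention, and all names are illustrative assumptions, not values taken from the paper.

import math

def interocular_from_near_pd(pd_near_mm, fixation_dist_mm,
                             pupil_to_center_mm=10.5, iterations=20):
    """Estimate the separation of the eyes' rotation centers from a
    near-PD measurement taken while converged on a midline target.

    Assumed geometry (illustrative only):
      - each eye rotates about a point a fixed distance behind the
        pupil plane (10.5 mm is a nominal, assumed offset),
      - fixating a midline target at fixation_dist_mm (measured from
        the rotation-center plane) turns each eye inward by
        theta = atan((I / 2) / fixation_dist_mm),
      - the measured pupil separation is then approximately
        PD_near = I - 2 * pupil_to_center_mm * sin(theta).
    The fixed-point loop below solves that relation for I.
    """
    interocular = pd_near_mm  # initial guess: ignore convergence
    for _ in range(iterations):
        theta = math.atan2(interocular / 2.0, fixation_dist_mm)
        interocular = pd_near_mm + 2.0 * pupil_to_center_mm * math.sin(theta)
    return interocular

# Example: a 60 mm PD measured at a 400 mm fixation distance implies
# roughly 61.6 mm between the rotation centers under these assumptions.
print(interocular_from_near_pd(60.0, 400.0))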
