Evaluating stereo vision and user tracking in mixed reality tasks

Advances in head tracking and stereoscopic visualization technologies have fostered the implementation of subjective display systems able to render a 3D scene with a perspective corrected according to the position of the user. This enables a whole class of mixed reality applications and interaction paradigms in which the user can move freely around the scene and perform tasks involving the interplay between physical and virtual objects. The accuracy and ergonomics of such tasks strongly depend on the ability of the subjective display system to offer not only a convincing 3D visual experience but also, and above all, an accurate rendering of the virtual scene in terms of the spatial and metric relations between virtual and physical scene components. In this paper we study the role and impact of head tracking and stereo visualization in mixed reality contexts using a set of measuring tasks that involve physical rulers and virtual objects, performed under different rendering conditions. Specifically, we analyze to what extent the two features contribute to giving the user a correct alignment between the virtual and the real components of a 3D scene. Finally, we draw some conclusions about their impact in different scenarios.
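To make the notion of a perspective-corrected (head-coupled) rendering concrete, the sketch below computes a generalized off-axis projection and view matrix from the tracked head position and the corners of a fixed planar display. This is a minimal illustration only, not the authors' implementation: the function name, the choice of corner parameterization, and the numpy-based formulation are assumptions introduced here for clarity.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, eye, near, far):
    """Off-axis (asymmetric-frustum) projection for a head-coupled display.

    pa, pb, pc : screen lower-left, lower-right, upper-left corners,
                 expressed in the tracking coordinate frame (e.g. metres).
    eye        : tracked head/eye position in the same frame.
    near, far  : near and far clipping distances.
    Returns (projection, view) as 4x4 row-major matrices.
    """
    pa, pb, pc, eye = (np.asarray(v, dtype=float) for v in (pa, pb, pc, eye))

    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)           # screen "right"
    vu = pc - pa; vu /= np.linalg.norm(vu)           # screen "up"
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal, toward the viewer

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye

    # Perpendicular distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane; asymmetric whenever the head
    # is not centered in front of the screen.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style projection matrix.
    proj = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0]])

    # View matrix: rotate the world into the screen basis,
    # then translate the tracked eye to the origin.
    rot = np.eye(4)
    rot[0, :3], rot[1, :3], rot[2, :3] = vr, vu, vn
    trans = np.eye(4)
    trans[:3, 3] = -eye
    view = rot @ trans
    return proj, view
```

For stereoscopic rendering under the same assumptions, the function would be called twice per frame, with the two eye positions offset from the tracked head position by half the interocular distance along the head's lateral axis, yielding one asymmetric frustum per eye while the screen corners stay fixed.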
