Virtual ball catching performance in different camera views

Virtual camera design is an important but tricky part of creating virtual reality experiences: interaction can feel awkward if the camera is not placed exactly at the user's eyes, but a 3rd person perspective (3PP) can provide a better view of the environment and/or the avatar. To inform camera design, we contribute the first study that systematically explores and quantifies how interaction difficulty changes when the camera is moved between a natural 1st person perspective (1PP) and a typical 3PP with the camera behind and above the user. In our experiment, 24 participants caught flying virtual balls in seven different camera views. Catching performance degraded almost linearly as a function of camera distance from 1PP, and after a quick but only partial initial adaptation, further adaptation to non-1PP views was slow or non-existent. Our results suggest that a natural 1PP should be used whenever possible, and that transitions between views should be kept to a minimum so that users do not constantly struggle to adapt. We also discuss how our results can be explained by the relation between camera perspective and retinal optical flow, and which interaction techniques can mitigate 3PP interaction problems.
