Perceptual requirements and proposals for the UbiCom augmented reality display

Introduction

In this report I propose perceptual requirements for the Augmented Reality (AR) display of the first UbiCom prototype. Ideally, the task to be done with our AR system would be defined first (Pasman, 1997; Davis et al., 1994; Ellis, 1997), but UbiCom is mainly a technology-driven project, so there is no single final task that the system has to support. Rather, a number of very general tasks should be feasible: maintenance and control, route planning, safety/inspection, acquiring information from the Internet, building complex (spatial) constructions, and playing games such as an adventure, paintball, etc. Consequently, we will have to reach the highest performance level required by any of these tasks. For example, for route planning it would be no problem if the arrow indicating the direction to go were not aligned perfectly with the world and disappeared when it moved towards the periphery of the visual field. For a paintball game, however, it would be annoying if the virtual container I am hiding behind were transparent or even invisible to other players. I start with an overview of the sources of optical distortion. Next, the perceptual consequences of these distortions are discussed, followed by a list of requirements. Thereafter I propose some solutions in an attempt to fulfill the requirements, and then make feasibility estimates for these solutions. Finally, conclusions are drawn about how the first UbiCom prototype might look.
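The registration issue mentioned above (a route-planning arrow drifting out of alignment with the world) comes down to reprojecting a world-anchored point through the tracked head pose on every frame, so any tracker error translates directly into pixel misalignment on the display. A minimal pinhole-projection sketch of this effect; all parameter values and names here are illustrative choices, not taken from the report:

```python
import math

def project(point_w, yaw, cam_pos, f=800.0, cx=320.0, cy=240.0):
    """Project a world point into pixel coordinates for a camera that
    is rotated by `yaw` radians about the vertical axis (pinhole model).
    Focal length and image center are illustrative assumptions."""
    # transform the world point into camera coordinates
    x, y, z = (point_w[i] - cam_pos[i] for i in range(3))
    c, s = math.cos(-yaw), math.sin(-yaw)
    xc = c * x + s * z
    zc = -s * x + c * z
    # perspective division onto the image plane
    return (cx + f * xc / zc, cy + f * y / zc)

arrow = (0.0, 0.0, 5.0)  # arrow anchored 5 m in front of the user
u0, v0 = project(arrow, 0.0, (0, 0, 0))
# a 1 degree heading error in the head tracker...
u1, v1 = project(arrow, math.radians(1.0), (0, 0, 0))
print(f"misalignment: {abs(u1 - u0):.1f} pixels")  # prints "misalignment: 14.0 pixels"
```

Even a one-degree tracker error shifts the overlay by about fourteen pixels at this (assumed) focal length, which is tolerable for a direction arrow but immediately visible when a virtual object must occlude or abut real geometry.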

[1] Richard Szeliski et al., Creating full view panoramic image mosaics and environment maps, 1997, SIGGRAPH.

[2] Steven K. Feiner et al., Windows on the world: 2D windows for 3D augmented reality, 1993, UIST '93.

[3] Larry F. Hodges et al., Comparison of 3-D display formats for CAD applications, 1991, Electronic Imaging.

[4] Christopher D. Wickens et al., Three-dimensional stereoscopic display implementation: guidelines derived from human visual capabilities, 1990, Other Conferences.

[5] Steven K. Feiner et al., A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment, 1997, Digest of Papers, First International Symposium on Wearable Computers.

[6] Jitendra Malik et al., Image-based rendering: Really new or deja vu?, 1997, SIGGRAPH.

[7] Matthias M. Wloka, Lag in Multiprocessor Virtual Reality, 1995, Presence: Teleoperators & Virtual Environments.

[8] Bernard D. Adelstein et al., Improved temporal response in virtual environments through system hardware and software reorganization, 1996, Electronic Imaging.

[9] Jay Torborg et al., Talisman: commodity realtime 3D graphics for the PC, 1996, SIGGRAPH.

[10] Jitendra Malik et al., Modeling and Rendering Architecture from Photographs: A hybrid geometry- and image-based approach, 1996, SIGGRAPH.

[11] Michael Bajura et al., Merging Virtual Objects with the Real World, 1992.

[12] Woodrow Barfield et al., Human Perception and Performance in 3D Virtual Environments, 1994.

[13] Thomas J. Smith et al., Behavioral Control Characteristics of Performance under Feedback Delay, 1994.

[14] R. E. Kalman et al., New Results in Linear Filtering and Prediction Theory, 1961.

[15] Michael J. Singer et al., Are Stereoscopic Displays Beneficial in Virtual Environments?, 1994.

[16] Takeo Kanade et al., CMU Video-Rate Stereo Machine, 1995.

[17] John G. Eyles et al., PixelFlow: high-speed rendering using image composition, 1992, SIGGRAPH.

[18] Edward H. Spain et al., Stereoscopic versus orthogonal view displays for performance of a remote manipulation task, 1991, Electronic Imaging.

[19] Mi-Suen Lee et al., Synthesizing Novel Views from Unregistered 2-D Images, 1997, Comput. Graph. Forum.

[20] Steve Mann et al., Wearable Computing: A First Step Toward Personal Imaging, 1997, Computer.

[21] Richard L. Holloway et al., Registration Error Analysis for Augmented Reality, 1997, Presence: Teleoperators & Virtual Environments.

[22] Harry L. Snyder et al., Comparison of depth cues for relative depth judgments, 1990, Other Conferences.

[23] Uwe H. List, Nonlinear Prediction of Head Movements for Helmet-Mounted Displays, 1983.

[24] John Snyder et al., Rendering with coherent layers, 1997, SIGGRAPH.

[25] Robert E. Cole et al., Remote-manipulator tasks impossible without stereo TV, 1990, Other Conferences.

[26] Ryutarou Ohbuchi et al., Merging virtual objects with the real world: seeing ultrasound imagery within the patient, 1992, SIGGRAPH.

[27] Robert E. Clapp, Stereoscopic Displays And The Human Dual Visual System, 1986, Photonics West - Lasers and Applications in Science and Engineering.

[28] Ronald Azuma et al., A demonstrated optical tracker with scalable work area for head-mounted display systems, 1992, I3D '92.

[29] Ken-ichi Anjyo et al., Tour into the picture: using a spidery mesh interface to make animation from a single image, 1997, SIGGRAPH.

[30] Ronald Azuma et al., Improving static and dynamic registration in an optical see-through HMD, 1994, SIGGRAPH.

[31] Pieter Padmos et al., Quality Criteria for Simulator Images: A Literature Review, 1992.

[32] Weijia Zhou et al., Proceedings of the Human Factors and Ergonomics Society, 1996.

[33] George J. Andersen et al., The use of occlusion to resolve ambiguity in parallel projections, 1982, Perception & Psychophysics.

[34] Marc Olano et al., Combatting rendering latency, 1995, I3D '95.

[35] O. Sabouraud, [Space perception], 1978, Revue d'oto-neuro-ophtalmologie.

[36] Ronald Azuma et al., Tracking a head-mounted display in a room-sized environment with head-mounted cameras, 1990, Defense, Security, and Sensing.

[37] Stephen R. Ellis et al., Judgments of the Distance to Nearby Virtual Objects: Interaction of Viewing Conditions and Accommodative Demand, 1997, Presence: Teleoperators & Virtual Environments.

[38] Steven M. Seitz et al., View morphing, 1996, SIGGRAPH.

[39] Eric Horvitz et al., Perception, Attention, and Resources: A Decision-Theoretic Approach to Graphics Rendering, 1997, UAI.

[40] John S. Montrym et al., InfiniteReality: a real-time graphics system, 1997, SIGGRAPH.

[41] Marc Levoy et al., Light field rendering, 1996, SIGGRAPH.

[42] Leonard McMillan et al., Head-tracked stereoscopic display using image warping, 1995, Electronic Imaging.

[43] Larry F. Hodges et al., The Perception of Distance in Simulated Visual Displays: A Comparison of the Effectiveness and Accuracy of Multiple Depth Cues Across Viewing Distances, 1997, Presence: Teleoperators & Virtual Environments.

[44] Daniel Cohen-Or et al., Selective Pixel Transmission for Navigating in Remote Virtual Environments, 1997, Comput. Graph. Forum.

[45] Wouter Pasman, Enhancing x-ray baggage inspection by interactive viewpoint selection, 1997.