Increasing the effective egocentric field of view with proprioceptive and tactile feedback

Multimodality often exhibits synergistic effects: each modality complements and compensates for the others in conveying coherent, unambiguous, and enriched information, yielding higher interaction efficiency and an improved sense of presence. In this paper, we explore one such phenomenon: a positive interaction among the geometric field of view (GFOV), proprioceptive interaction, and tactile feedback. We hypothesize that, with proprioceptive interaction and tactile feedback, the GFOV, and thus visibility, can be increased beyond the physical field of view without causing a significant distortion in the user's distance perception. This, in turn, would further aid the overall multimodal interaction scheme, since the user is more likely to receive the multimodal feedback simultaneously. We tested our hypothesis with an experiment measuring changes in the user's distance perception under different egocentric GFOV values and feedback conditions. Our experimental results show that, when coupled with physical interaction, the GFOV could be enlarged to up to 170 percent of the physical field of view without introducing significant distortion in distance perception, and that, when tactile feedback was added to the visual and proprioceptive cues, the GFOV could be enlarged to up to 200 percent. These results offer a useful guideline for effectively exploiting modality compensation and building multimodal interfaces for close-range spatial tasks in virtual environments. They also demonstrate one way to overcome the narrow physical fields of view of most contemporary head-mounted displays (HMDs).
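The abstract quantifies the GFOV as a ratio of the physical field of view. As a minimal illustrative sketch, the snippet below builds an OpenGL-style perspective projection whose vertical FOV is a scaled multiple of an assumed physical FOV; the 60-degree physical FOV, the function names, and the projection convention are assumptions made here for illustration, not details taken from the paper.

```python
import math

def perspective_matrix(vfov_deg, aspect, near, far):
    """OpenGL-style perspective projection from a vertical field of view."""
    f = 1.0 / math.tan(math.radians(vfov_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0,                           0.0],
        [0.0,        f,   0.0,                           0.0],
        [0.0,        0.0, (far + near) / (near - far),   2.0 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                          0.0],
    ]

# Hypothetical HMD with a 60-degree physical vertical FOV (illustrative value).
PHYSICAL_VFOV_DEG = 60.0

def gfov_projection(gfov_scale, aspect=1.0, near=0.1, far=100.0):
    """Render with a geometric FOV that is gfov_scale times the physical FOV.

    gfov_scale = 1.0 reproduces the physical FOV; the results above suggest
    scales up to 1.7 (with proprioceptive interaction) or 2.0 (with added
    tactile feedback) without significant distance-perception distortion.
    """
    return perspective_matrix(gfov_scale * PHYSICAL_VFOV_DEG, aspect, near, far)

# GFOV at 170 percent of the physical field of view:
proj = gfov_projection(1.7)
```

Rendering with `gfov_scale = 1.7` or `2.0` corresponds to the 170- and 200-percent conditions reported above; the scale factor is the only knob, so the same camera code serves all feedback conditions.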
