Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays

Current augmented reality displays still have a very limited field of view compared to human vision. To localize out-of-view objects, researchers have predominantly explored visual guidance approaches that visualize information in the limited (in-view) screen space. Unfortunately, visual conflicts such as clutter or occlusion of information often arise, which can degrade search performance and reduce awareness of the physical environment. In this paper, we compare a non-visual guidance approach based on audio-tactile cues with the state-of-the-art visual guidance technique EyeSee360 for localizing out-of-view objects in augmented reality displays with a limited field of view. In our user study, we evaluate both guidance methods in terms of search performance and situation awareness. We show that although audio-tactile guidance is generally slower than the well-performing EyeSee360 in terms of search times, it achieves a comparable hit rate. Moreover, the audio-tactile method provides a significant improvement in situation awareness compared to the visual approach.
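
The paper does not include source code, but the core idea of audio-tactile guidance can be made concrete with a minimal sketch. The Python snippet below is purely illustrative: it assumes a simple mapping from the horizontal offset between the user's head orientation and an out-of-view target to a stereo audio pan and a vibration intensity. The function name bearing_to_cues and the scaling constants are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: the mapping below (angular offset -> stereo pan,
# angular error -> vibration intensity) is an assumption chosen to make the
# idea of audio-tactile guidance concrete, not the authors' actual method.
import math


def bearing_to_cues(head_yaw_deg: float, target_yaw_deg: float,
                    max_intensity: float = 1.0):
    """Map the horizontal offset between head and target to a stereo pan in
    [-1, 1] (left/right) and a vibration intensity in [0, 1] that grows with
    angular error (hypothetical mapping)."""
    # Signed angular difference wrapped to (-180, 180] degrees.
    diff = (target_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    pan = max(-1.0, min(1.0, diff / 90.0))            # full pan at >= 90 deg offset
    intensity = max_intensity * min(1.0, abs(diff) / 180.0)
    return pan, intensity


if __name__ == "__main__":
    # Target 120 degrees to the user's right: expect hard-right pan, strong vibration.
    pan, intensity = bearing_to_cues(head_yaw_deg=0.0, target_yaw_deg=120.0)
    print(f"pan={pan:+.2f}, vibration={intensity:.2f}")
```

A real implementation would additionally encode elevation and render the cues through spatial audio and head-mounted vibrotactile actuators, but the directional mapping above captures the basic principle.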

[1]  Gerald Matthews,et al.  Self-Report Arousal and Divided Attention: A Study of Performance Operating Characteristics , 1991 .

[2]  G. Michel,et al.  Restricting the Field of View: Perceptual and Performance Effects , 1990, Perceptual and motor skills.

[3]  Ernst Kruijff,et al.  Non-Visual Cues for View Management in Narrow Field of View Augmented Reality Displays , 2019, 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

[4]  Ali Israr,et al.  Tactile brush: drawing on skin with a tactile grid display , 2011, CHI.

[5]  Jan Theeuwes,et al.  Competition between auditory and visual spatial cues during visual task performance , 2009, Experimental Brain Research.

[6]  Wolfgang Stuerzlinger,et al.  Is the Pen Mightier than the Controller? A Comparison of Input Devices for Selection in Virtual and Augmented Reality , 2019, VRST.

[7]  Antti Oulasvirta,et al.  Dynamic tactile guidance for visual search tasks , 2012, UIST '12.

[8]  Zhengyou Zhang,et al.  Auditory augmented reality: Object sonification for the visually impaired , 2012, 2012 IEEE 14th International Workshop on Multimedia Signal Processing (MMSP).

[9]  Kiyoshi Kiyokawa,et al.  The Influence of Label Design on Search Performance and Noticeability in Wide Field of View Augmented Reality Displays , 2019, IEEE Transactions on Visualization and Computer Graphics.

[10]  Roberto Bresin,et al.  A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities , 2013, PloS one.

[11]  Wilko Heuten,et al.  Improving Search Time Performance for Locating Out-of-View Objects in Augmented Reality , 2019, Mensch & Computer.

[12]  Maxwell J. Wells,et al.  The Effect Of Field-Of-View Size On Performance At A Simple Simulated Air-To-Air Mission , 1989, Defense, Security, and Sensing.

[13]  Gerd Bruder,et al.  Analysis of Proximity-Based Multimodal Feedback for 3D Selection in Immersive Virtual Environments , 2018, 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).

[14]  Mark Dredze,et al.  Pokémon GO - A New Distraction for Drivers and Pedestrians , 2016, JAMA Internal Medicine.

[15]  Eric Kolstad,et al.  Egocentric depth judgments in optical, see-through augmented reality , 2007, IEEE Transactions on Visualization and Computer Graphics.

[16]  John P. McIntire,et al.  Visual Search Performance With 3-D Auditory Cues: Effects of Motion, Target Location, and Practice , 2010, Hum. Factors.

[17]  William S. Helton,et al.  Visual Cues to Reorient Attention from Head Mounted Displays , 2016 .

[18]  Wilko Heuten,et al.  Locating nearby physical objects in augmented reality , 2019, MUM.

[19]  Michael Beigl,et al.  ProximityHat: a head-worn system for subtle sensory augmentation with tactile stimulation , 2015, SEMWEB.

[20]  Mark Billinghurst,et al.  Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality , 2018, CHI.

[21]  Anil K. Raj,et al.  Multimodal and Multisensory Displays for Perceptual Tasks , 2015 .

[22]  Byoung-Jun Park,et al.  Augmented reality based on driving situation awareness in vehicle , 2015, 2015 17th International Conference on Advanced Communication Technology (ICACT).

[23]  Ross T. Smith,et al.  Cognitive Cost of Using Augmented Reality Displays , 2017, IEEE Transactions on Visualization and Computer Graphics.

[24]  Tae-Young Lee,et al.  Supporting Driver Situation Awareness for Autonomous Urban Driving with an Augmented-Reality Windshield Display , 2018, 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct).

[25]  Javier Irizarry,et al.  InfoSPOT: A mobile Augmented Reality method for accessing building information through a situation awareness approach , 2013 .

[26]  Mihran Tuceryan,et al.  Automatic determination of text readability over textured backgrounds for augmented reality systems , 2004, Third IEEE and ACM International Symposium on Mixed and Augmented Reality.

[27]  Neville Stanton,et al.  Situation awareness measurement: a review of applicability for C4i environments. , 2006, Applied ergonomics.

[28]  Robert W. Lindeman,et al.  Towards full-body haptic feedback: the design and deployment of a spatialized vibrotactile feedback system , 2004, VRST '04.

[29]  T. Schnell,et al.  Terrain awareness & pathway guidance for head-up displays (tapguide); a simulator study of pilot performance , 2003, Digital Avionics Systems Conference, 2003. DASC '03. The 22nd.

[30]  Rolf Zon,et al.  Eye Movements as an Indicator of Situation Awareness in a Flight Simulator Experiment , 2012 .

[31]  Johannes Schöning,et al.  Tactile hand motion and pose guidance for 3D interaction , 2018, VRST.

[32]  Thies Pfeiffer,et al.  Attention guiding techniques using peripheral vision and eye tracking for feedback in augmented-reality-based assistance systems , 2017, 2017 IEEE Symposium on 3D User Interfaces (3DUI).

[33]  C. Spence,et al.  Multisensory Integration: Maintaining the Perception of Synchrony , 2003, Current Biology.

[34]  Weisi Lin,et al.  Selective Visual Attention: Computational Models and Applications , 2013 .

[35]  Katherine J. Kuchenbecker,et al.  Effects of Vibrotactile Feedback on Human Learning of Arm Motions , 2015, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[36]  D. Damos Multiple-task performance , 2020 .

[37]  Wilko Heuten,et al.  EyeSeeX: Visualization of Out-of-View Objects on Small Field-of-View Augmented and Virtual Reality Devices , 2018, PerDis.

[38]  Steven K. Feiner,et al.  The Effect of Narrow Field of View and Information Density on Visual Search Performance in Augmented Reality , 2019, 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).

[39]  Susanne Boll,et al.  FlyingARrow: Pointing Towards Out-of-View Objects on Augmented Reality Devices , 2018, PerDis.

[40]  Wilko Heuten,et al.  Comparing Techniques for Visualizing Moving Out-of-View Objects in Head-mounted Virtual Reality , 2019, 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).

[41]  L. Nolan,et al.  Biological psychology , 2019 .

[42]  Z. J. Lipowski,et al.  Sensory and information inputs overload: behavioral effects. , 1975, Comprehensive psychiatry.

[43]  Peter A. Hancock,et al.  Field of View Effects on Pilot Performance in Flight , 2010 .

[44]  Zijiang J. He,et al.  Perceiving distance accurately by a directional process of integrating ground information , 2004, Nature.

[45]  Daniel J. Garland,et al.  Situation Awareness Analysis and Measurement , 2009 .

[46]  Steven K. Feiner,et al.  Perceptual issues in augmented reality revisited , 2010, 2010 IEEE International Symposium on Mixed and Augmented Reality.

[47]  Tobias Höllerer,et al.  Simulation of Augmented Reality Systems in Purely Virtual Environments , 2009, 2009 IEEE Virtual Reality Conference.

[48]  Tobias Höllerer,et al.  Evaluating wide-field-of-view augmented reality with mixed reality simulation , 2016, 2016 IEEE Virtual Reality (VR).

[49]  Robert W. Lindeman,et al.  Effective Vibrotactile Cueing in a Visual Search Task , 2003, INTERACT.

[50]  Michael Rohs,et al.  HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid , 2016, CHI Extended Abstracts.

[51]  Deborah Hix,et al.  The Effects of Text Drawing Styles, Background Textures, and Natural Lighting on Text Legibility in Outdoor Augmented Reality , 2006, Presence: Teleoperators & Virtual Environments.

[52]  Niels Henze,et al.  Influence of subliminal cueing on visual search tasks , 2013, CHI Extended Abstracts.

[53]  Charles Spence Crossmodal attention , 1998, Scholarpedia.

[54]  Steven K. Feiner,et al.  View management for virtual and augmented reality , 2001, UIST '01.

[55]  Mariusz Chmielewski,et al.  Application of Augmented Reality, Mobile Devices, and Sensors for a Combat Entity Quantitative Assessment Supporting Decisions and Situational Awareness Development , 2019 .

[56]  C. Spence,et al.  Auditory, tactile, and multisensory cues facilitate search for dynamic visual stimuli , 2010, Attention, perception & psychophysics.

[57]  Pourang Irani,et al.  Moving Ahead with Peephole Pointing: Modelling Object Selection with Head-Worn Display Field of View Limitations , 2016, SUI.

[58]  Anderson Maciel,et al.  Designing a Vibrotactile Head-Mounted Display for Spatial Awareness in 3D Spaces , 2017, IEEE Transactions on Visualization and Computer Graphics.

[59]  Walter D. Potter,et al.  Vibrotactile Glove guidance for semi-autonomous wheelchair operations , 2008, ACM-SE 46.

[60]  Steven K. Feiner,et al.  Directing attention and influencing memory with visual saliency modulation , 2011, CHI.

[61]  Gudrun Klinker,et al.  Supporting order picking with Augmented Reality , 2008, 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.

[62]  Ernst Kruijff,et al.  FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments , 2020, CHI.

[63]  Gregory K. Tharp,et al.  Visual search in virtual environments , 1992, Electronic Imaging.

[64]  Kiyoshi Kiyokawa,et al.  Analysing the effects of a wide field of view augmented reality display on search performance in divided attention tasks , 2014, 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

[65]  Michel Denis,et al.  NAVIG: augmented reality guidance system for the visually impaired , 2012, Virtual Reality.

[66]  Kangsoo Kim,et al.  Revisiting Trends in Augmented Reality Research: A Review of the 2nd Decade of ISMAR (2008–2017) , 2018, IEEE Transactions on Visualization and Computer Graphics.

[67]  R. Klatzky,et al.  Sensory Substitution of Vision: Importance of Perceptual and Cognitive Processing , 2018, Assistive Technology for Blindness and Low Vision.

[68]  Pirkko Oittinen,et al.  Stereoscopic depth perception in video see-through augmented reality within action space , 2014, J. Electronic Imaging.

[69]  Charles Spence,et al.  Does crossmodal correspondence modulate the facilitatory effect of auditory cues on visual search? , 2012, Attention, Perception, & Psychophysics.

[70]  Masatoshi Ishikawa,et al.  Augmenting spatial awareness with Haptic Radar , 2006, 2006 10th IEEE International Symposium on Wearable Computers.

[71]  Ronald Azuma,et al.  Evaluating label placement for augmented reality view management , 2003, The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings..

[72]  Dieter Schmalstieg,et al.  Image-driven view management for augmented reality browsers , 2012, 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

[73]  Jan Theeuwes,et al.  Pip and pop: nonspatial auditory signals improve spatial visual search. , 2008, Journal of experimental psychology. Human perception and performance.

[74]  Carl Gutwin,et al.  Peripheral Popout: The Influence of Visual Angle and Stimulus Intensity on Popout Effects , 2017, CHI.

[75]  Heinrich Hußmann,et al.  Guidance in Cinematic Virtual Reality - Taxonomy, Research Status and Challenges , 2019, Multimodal Technol. Interact.

[76]  Stephen R. Ellis,et al.  Label segregation by remapping stereoscopic depth in far-field augmented reality , 2008, 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.

[77]  Nassir Navab,et al.  Towards Efficient Visual Guidance in Limited Field-of-View Head-Mounted Displays , 2018, IEEE Transactions on Visualization and Computer Graphics.

[78]  Shachar Maidenbaum,et al.  Sensory Substitution: Closing the Gap between Basic Research and Widespread Practical Visual Rehabilitation , 2014, Neuroscience and Biobehavioral Reviews.

[79]  E. J. Mccormick,et al.  The use of auditory cues in a visual search task. , 1960 .

[80]  Henry Fuchs,et al.  FocusAR: Auto-focus Augmented Reality Eyeglasses for both Real World and Virtual Imagery , 2018, IEEE Transactions on Visualization and Computer Graphics.

[81]  Richard L. Newman,et al.  Head-Up Displays: Designing the Way Ahead , 1995 .

[82]  Ronald Azuma,et al.  A Survey of Augmented Reality , 1997, Presence: Teleoperators & Virtual Environments.

[83]  Richard Kronland-Martinet,et al.  Comparison and Evaluation of Sonification Strategies for Guidance Tasks , 2016, IEEE Transactions on Multimedia.

[84]  Stephen R. Ellis,et al.  Localization of Virtual Objects in the Near Visual Field , 1998, Hum. Factors.

[85]  Anthony D. Andre,et al.  Situation Awareness in an Augmented Reality Cockpit: Design, Viewpoints and Cognitive Glue , 2005 .

[86]  Tim Claudius Stratmann,et al.  Ensuring Safety in Augmented Reality from Trade-off Between Immersion and Situation Awareness , 2018, 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

[87]  Stephen R. Ellis,et al.  Managing Visual Clutter: A Generalized Technique for Label Segregation using Stereoscopic Disparity , 2008, 2008 IEEE Virtual Reality Conference.

[88]  Thies Pfeiffer,et al.  Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views , 2018, COGAIN@ETRA.

[89]  Susanne Boll,et al.  EyeSee360: designing a visualization technique for out-of-view objects in head-mounted augmented reality , 2017, SUI.

[90]  J. Edward Swan,et al.  Military Applications of Augmented Reality , 2011, Handbook of Augmented Reality.