Attributes of Subtle Cues for Facilitating Visual Search in Augmented Reality

Goal-oriented visual search occurs when a person intentionally seeks a target in the visual environment. In augmented reality (AR), visual search can be facilitated by overlaying virtual cues on the person's field of view. Traditional explicit AR cues, however, can degrade visual search performance because they distort the scene. An alternative, known as subtle cueing, has been proposed as a clutter-neutral method of enhancing visual search in video see-through AR. The effects of subtle cueing are still not well understood, however, and more research is needed to determine how best to apply it in AR. We performed two experiments investigating the effects of scene clutter and of subtle-cue opacity, size, and shape on visual search performance. We also introduce a novel method of experimentally manipulating scene clutter in a natural scene while controlling for other variables. The findings provide further evidence for the subtlety of the cue and show that the clutter conditions of the scene can serve both as a global classifier and as a local performance measure.
