Detection thresholds for label motion in visually cluttered displays

Label placement algorithms generally manage visual clutter well by preventing label overlap, but in dynamic displays they can also cause significant label movement. This study investigates motion detection thresholds for several types of label movement in realistic, complex virtual environments, with the aim of informing algorithms that produce less salient and less disturbing motion. Our results show that label movement in stereoscopic depth is less noticeable than comparable lateral monoscopic movement, which is inherent to 2D label placement algorithms. Furthermore, label movement can be introduced more readily into the visual periphery (beyond 15° eccentricity) because of the reduced motion sensitivity in that region. Finally, under the realistic viewing conditions we used, motion of isolated labels is more easily detected than motion of overlapping labels. This perhaps counterintuitive finding may be explained by visual masking caused by the clutter arising from label overlap. The quantitative description of these findings should be useful not only for label placement applications, but also for any cluttered AR or VR application in which designers wish to control users' visual attention, making text labels more or less noticeable as needed.
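
To illustrate how such thresholds might feed into a label placement algorithm, the sketch below scores the estimated noticeability of a candidate label movement. It is a minimal, hypothetical example: the function `motion_salience`, its weights, and the 15° eccentricity cutoff are assumptions for illustration, not values taken from the study's psychometric data.

```python
import math

# Hypothetical salience weights -- illustrative only. Real values would be
# derived from measured detection thresholds.
LATERAL_WEIGHT = 1.0      # baseline: monoscopic lateral motion
DEPTH_WEIGHT = 0.5        # stereoscopic depth motion is less noticeable
PERIPHERY_FACTOR = 0.4    # reduced sensitivity beyond ~15 deg eccentricity
OVERLAP_FACTOR = 0.6      # overlapping (cluttered) labels mask motion


def motion_salience(displacement_deg: float,
                    in_depth: bool,
                    eccentricity_deg: float,
                    overlaps: bool) -> float:
    """Estimate how noticeable a proposed label movement would be.

    displacement_deg : angular size of the movement (degrees of visual angle)
    in_depth         : True if the label moves in stereoscopic depth
    eccentricity_deg : angular distance of the label from gaze direction
    overlaps         : True if the label currently overlaps other labels
    """
    cost = displacement_deg * (DEPTH_WEIGHT if in_depth else LATERAL_WEIGHT)
    if eccentricity_deg > 15.0:
        cost *= PERIPHERY_FACTOR
    if overlaps:
        cost *= OVERLAP_FACTOR
    return cost


# A placement algorithm could prefer the candidate move with the lowest
# estimated salience, e.g. favoring depth or peripheral movements.
candidates = [
    {"dx": 0.8, "in_depth": False, "ecc": 5.0,  "overlaps": False},
    {"dx": 0.8, "in_depth": True,  "ecc": 5.0,  "overlaps": False},
    {"dx": 0.8, "in_depth": False, "ecc": 20.0, "overlaps": True},
]
best = min(candidates,
           key=lambda c: motion_salience(c["dx"], c["in_depth"],
                                         c["ecc"], c["overlaps"]))
print(best)
```

In a real system, the same idea could also be inverted: weights like these could be used to make a label movement deliberately more salient when the designer wants to attract the user's attention.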
