Hidden in Plain Sight: an Exploration of a Visual Language for Near-Eye Out-of-Focus Displays in the Peripheral View

In this paper, we set out to discover what constitutes an appropriate visual language for information presented on near-eye out-of-focus displays. These displays are positioned in a user's peripheral view, very close to the eyes, for example on the inside of the temples of a pair of glasses. We explored the usable display area, the role of spatial and retinal variables, and the influence of motion and interaction on such a language. Our findings show that a usable visual language can be achieved by limiting the set of possible shapes and by making clever use of orientation and meaningful motion. Motion in particular proved important for improving the perception and comprehension of what is shown on near-eye out-of-focus displays, and perception improves further when direct interaction with the content is allowed.
