Boundary conditions for information visualization with respect to the user's gaze

Gaze tracking in Augmented Reality is mainly used to trigger buttons and access information. Such selectable objects are usually placed in the world or in the screen coordinates of a head- or hand-mounted display. Yet no work has investigated options for placing information relative to the line of sight. This work presents our first steps towards gaze-mounted information visualization and interaction, determining boundary conditions for such an approach. We propose a general concept for information presentation at an angular offset to the line of sight: the user can look around freely while information remains attached near the line of sight, and whenever the user looks at the information, it is placed directly on the axis of sight for a short time. Based on this concept we investigate how users understand frames of reference, specifically whether users relate directions and alignments to head or world coordinates. We further investigate whether a particular motion behavior of the information is preferred. Prototypical implementations of three variants are presented to users in guided interviews. The three variants comprise a rigid offset and two different floating motion behaviors of the information. The floating algorithms implement an inertia-based model and either allow the user's gaze to surpass the information or to push the information with the gaze. Testing our prototypes showed that users strongly prefer information that maintains its relation to the world, and that less additional motion is preferred.
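The floating behavior described above can be sketched with a simple spring-damper (inertia) model: the label trails the gaze at a fixed angular offset and snaps onto the axis of sight when the gaze comes close. This is a minimal illustrative sketch only; all names and parameter values are assumptions, not the authors' implementation, and angles are reduced to one dimension for clarity.

```python
# Hypothetical sketch of a gaze-mounted floating label (1D angles, degrees).
# Parameter values are illustrative assumptions, not taken from the paper.

OFFSET_DEG = 10.0   # angular offset of the label from the line of sight
SNAP_DEG = 2.0      # gaze closer than this snaps the label onto the axis
STIFFNESS = 8.0     # spring constant of the inertia-based model
DAMPING = 4.0       # damping to avoid visible oscillation

def update_label(label_deg, vel, gaze_deg, dt):
    """One simulation step of the floating label."""
    # When the gaze comes close enough, the label moves onto the axis of
    # sight; otherwise it trails the gaze at a fixed angular offset.
    if abs(gaze_deg - label_deg) < SNAP_DEG:
        target = gaze_deg
    else:
        target = gaze_deg + OFFSET_DEG
    # Spring-damper dynamics, integrated with semi-implicit Euler.
    accel = STIFFNESS * (target - label_deg) - DAMPING * vel
    vel += accel * dt
    label_deg += vel * dt
    return label_deg, vel

# With a steady gaze at 0 deg, a label starting far away settles at the offset.
label, vel = 30.0, 0.0
for _ in range(500):          # 10 s at 50 Hz
    label, vel = update_label(label, vel, gaze_deg=0.0, dt=0.02)
print(round(label, 2))        # settles near OFFSET_DEG
```

The "surpass" versus "push" variants from the study would differ in how `target` reacts when the gaze moves toward the label; the sketch above shows only the common inertia mechanism.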
