KIBITZER: a wearable system for eye-gaze-based mobile urban exploration

Due to the vast amount of available georeferenced information, novel techniques for interacting with such content more intuitively and efficiently are increasingly required. In this paper, we introduce KIBITZER, a lightweight wearable system that enables users to browse their urban surroundings for annotated digital information. KIBITZER exploits its user's eye gaze as a natural indicator of attention to identify objects of interest and offers speech and non-speech auditory feedback, thus providing the user with a sixth sense for digital georeferenced information. We describe our system's architecture and interaction technique and outline experiences from first functional trials.
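To illustrate the kind of gaze-to-object resolution such a system must perform, the sketch below intersects the user's gaze bearing with a set of georeferenced points of interest. This is a minimal illustration under our own assumptions (a GPS position, a combined compass/eye-tracker azimuth, a small angular selection cone, and hypothetical POI data); the function names, thresholds, and flat selection logic are ours, not KIBITZER's actual implementation.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: resolve the user's gaze bearing against
# georeferenced points of interest (POIs). All names and thresholds
# are illustrative assumptions, not the system's real code.

@dataclass
class POI:
    name: str
    lat: float  # degrees
    lon: float  # degrees

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in metres."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def object_of_interest(user_lat, user_lon, gaze_azimuth_deg, pois,
                       cone_deg=5.0, max_range_m=300.0):
    """Return the nearest POI whose bearing falls inside the gaze cone."""
    best, best_dist = None, float("inf")
    for poi in pois:
        d = distance_m(user_lat, user_lon, poi.lat, poi.lon)
        if d > max_range_m:
            continue  # too far away to be plausibly attended to
        b = bearing_deg(user_lat, user_lon, poi.lat, poi.lon)
        diff = abs((b - gaze_azimuth_deg + 180.0) % 360.0 - 180.0)
        if diff <= cone_deg / 2 and d < best_dist:
            best, best_dist = poi, d
    return best

# Example: a user in Vienna looks roughly southeast.
pois = [POI("Stephansdom", 48.2085, 16.3731), POI("Haas-Haus", 48.2082, 16.3717)]
hit = object_of_interest(48.2089, 16.3725, gaze_azimuth_deg=135.0, pois=pois)
if hit:
    print(f"Announce via TTS: {hit.name}")  # stands in for the auditory feedback
```

In practice the selection cone and range would need to absorb GPS, compass, and eye-tracker error, which is why a generous angular threshold rather than an exact ray intersection is assumed here.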
