GazeHorizon: enabling passers-by to interact with public displays by gaze

Public displays can be made interactive by adding gaze control. However, gaze interfaces offer no physical affordance and require users to move into a tracking range. We present GazeHorizon, a system that provides interactive assistance so that passers-by can walk up to a display and navigate its content using their eyes only. The system was developed through a series of field studies culminating in a four-day deployment in a public environment. Our results show that novice users can successfully use gaze control when the interface makes its interactivity apparent at first glance and guides them interactively into the tracking range.