Eye tracking for public displays in the wild

Abstract

In public display contexts, interactions are spontaneous and must work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and a natural indicator of the user's interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is designed for extemporaneous use and optimised for instantaneous usability by any user, without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of horizontal gaze direction, and maps this input to rate-controlled navigation of horizontally arranged content. We evaluated GazeHorizon through a series of field studies, culminating in a four-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that because eye movements are subtle, users cannot learn gaze interaction merely by observing others; as a result, guidance is required.
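To make the rate-control mapping concrete, the sketch below shows one plausible way to turn a horizontal gaze estimate into a scroll velocity with a central dead zone. It is a minimal sketch under stated assumptions: the [-1, 1] input convention (negative = left, positive = right), the scroll_velocity function, and the dead-zone, gain, and speed parameters are illustrative choices, not details taken from the paper.

# Minimal sketch of rate-controlled navigation from a horizontal gaze
# estimate. Assumes an upstream tracker yields a normalised gaze value
# in [-1.0, 1.0] (negative = left, positive = right). All parameters
# below are illustrative, not values from the paper.

DEAD_ZONE = 0.15   # ignore small estimates near the centre (tracker noise)
MAX_SPEED = 800.0  # maximum scroll speed, pixels per second
GAIN = 2.0         # how sharply speed ramps up outside the dead zone


def scroll_velocity(gaze: float) -> float:
    """Map a normalised horizontal gaze estimate to a scroll velocity.

    Looking towards an edge of the display scrolls the content in that
    direction; the further the gaze is from centre, the faster the scroll.
    """
    magnitude = abs(gaze)
    if magnitude < DEAD_ZONE:
        return 0.0  # gaze near centre: content stays still
    # Rescale so speed rises smoothly from zero at the dead-zone boundary.
    normalised = (magnitude - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    speed = MAX_SPEED * min(1.0, GAIN * normalised)
    return speed if gaze > 0 else -speed


if __name__ == "__main__":
    for g in (-1.0, -0.5, -0.1, 0.0, 0.2, 0.6, 1.0):
        print(f"gaze={g:+.1f} -> velocity={scroll_velocity(g):+.1f} px/s")

The dead zone matters for walk-up use: without it, noise in a person-independent gaze estimate would keep the content drifting even when a passer-by is simply reading the centre of the screen.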
