SideWays: a gaze interface for spontaneous interaction with situated displays

Eye gaze is compelling for interaction with situated displays, as we naturally use our eyes to engage with them. In this work we present SideWays, a novel person-independent eye gaze interface that supports spontaneous interaction with displays: users can simply walk up to a display and immediately interact using their eyes, without any prior user calibration or training. Requiring only a single off-the-shelf camera and lightweight image processing, SideWays robustly detects whether users attend to the centre of the display or cast glances to the left or right. The system supports an interaction model in which attention to the central display is the default state, while "sidelong glances" trigger input or actions. The robustness of the system and the usability of the interaction model are validated in a study with 14 participants. Analysis of the participants' strategies in performing different tasks provides insights into gaze control strategies for the design of SideWays applications.
