Eye tracking for public displays in the wild