Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design
Meredith Ringel Morris | Ann Paradiso | Shaun K. Kane | Anna Maria Feit | Shane Williams | Arturo Toledo | Harish Kulkarni
[1] Päivi Majaranta,et al. Design issues of iDICT: a gaze-assisted translation aid , 2000, ETRA.
[2] Diako Mardanbegi,et al. EyeGrip: Detecting Targets in a Series of Uni-directional Moving Objects Using Optokinetic Nystagmus Eye Movements , 2016, CHI.
[3] Giulio Jacucci,et al. Pointing while Looking Elsewhere: Designing for Varying Degrees of Visual Guidance during Manual Input , 2016, CHI.
[4] Joseph H. Goldberg,et al. Identifying fixations and saccades in eye-tracking protocols , 2000, ETRA.
[5] Andreas Bulling,et al. EyeTab: model-based gaze estimation on unmodified tablet computers , 2014, ETRA.
[6] Fong-Gong Wu,et al. Visual and manual loadings with QWERTY-like ambiguous keyboards: Relevance of letter-key assignments on mobile phones , 2015 .
[7] Francisco B. Rodríguez,et al. Gliding and saccadic gaze gesture recognition in real time , 2012, TIIS.
[8] Andreas Bulling,et al. Computational Modelling and Prediction of Gaze Estimation Error for Head-mounted Eye Trackers , 2015 .
[9] Håkon Raudsandmoen,et al. Empirically Based Design Guidelines for Gaze Interaction in Windows 7 , 2012 .
[10] Howell O. Istance,et al. Zooming interfaces!: enhancing the performance of eye controlled pointing devices , 2002, Assets '02.
[11] Dave M. Stampe,et al. Heuristic filtering and reliable calibration methods for video-based pupil-tracking systems , 1993 .
[12] Andrew T. Duchowski,et al. Efficient eye pointing with a fisheye lens , 2005, Graphics Interface.
[13] Andreas Paepcke,et al. EyePoint: practical pointing and selection using gaze and keyboard , 2007, CHI.
[14] Oleg Spakov. Comparison of eye movement filters used in HCI , 2012, ETRA '12.
[15] Jacob O. Wobbrock,et al. Longitudinal evaluation of discrete consecutive gaze gestures for text entry , 2008, ETRA.
[16] Andreas Paepcke,et al. Improving the accuracy of gaze input for interaction , 2008, ETRA.
[17] Howell O. Istance,et al. Designing gaze gestures for gaming: an investigation of performance , 2010, ETRA.
[18] Kristien Ooms,et al. Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental set-ups , 2015 .
[19] Richard H. R. Hahnloser,et al. Eye-Trace: Segmentation of Volumetric Microscopy Images with Eyegaze , 2016, CHI.
[20] Hans-Werner Gellersen,et al. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets , 2013, UbiComp.
[21] Tal Garfinkel,et al. Reducing shoulder-surfing by using gaze-based password entry , 2007, SOUPS '07.
[22] Diego Gutierrez,et al. Gaze-based Interaction for Virtual Environments , 2008, J. Univers. Comput. Sci..
[23] Albrecht Schmidt,et al. Interacting with the Computer Using Gaze Gestures , 2007, INTERACT.
[24] Darren Gergle,et al. Gazed and Confused: Understanding and Designing Shared Gaze for Remote Collaboration , 2016, CHI.
[25] Alan Kennedy,et al. Book Review: Eye Tracking: A Comprehensive Guide to Methods and Measures , 2016, Quarterly journal of experimental psychology.
[26] Margrit Betke,et al. EyeSwipe: Dwell-free Text Entry Using Gaze Paths , 2016, CHI.
[27] Bibianna Bałaj,et al. Päivi Majaranta, Hirotaka Aoki, Mick Donegan, Dan Witzner Hansen, John Paulin Hansen, Aulikki Hyrskykari, Kari-Jouko Räihä (eds.), Gaze interaction and applications of eye tracking: Advances in assistive technologies, Hershey, PA: IGI Global 2012, 382 pp. (Review) , 2012 .
[28] Oleg Spakov,et al. Fast gaze typing with an adjustable dwell time , 2009, CHI.
[29] Päivi Majaranta,et al. Eye Tracking and Eye-Based Human–Computer Interaction , 2014 .
[30] Marcus Nyström,et al. Improving the Accuracy of Video-Based Eye-Tracking in Real-Time through Post-Calibration Regression , 2014 .
[31] Feng Liu,et al. Gaze-based Notetaking for Learning from Lecture Videos , 2016, CHI.
[32] Päivi Majaranta,et al. Gaze Interaction and Applications of Eye Tracking - Advances in Assistive Technologies , 2011 .
[33] Shumin Zhai,et al. Manual and gaze input cascaded (MAGIC) pointing , 1999, CHI '99.
[34] Marcus Nyström,et al. The influence of calibration method and eye physiology on eyetracking data quality , 2013, Behavior research methods.
[35] Scott E. Hudson,et al. A framework for robust and flexible handling of inputs with uncertainty , 2010, UIST.
[36] Oleg Spakov,et al. PursuitAdjuster: an exploration into the design space of smooth pursuit-based widgets , 2016, ETRA.
[37] Roel Vertegaal,et al. EyeWindows: evaluation of eye-controlled zooming windows for focus selection , 2005, CHI.
[38] John L. Sibert,et al. The reading assistant: eye gaze triggered auditory prompting for reading remediation , 2000, UIST '00.
[39] Gerhard Rigoll,et al. SPOCK: A Smooth Pursuit Oculomotor Control Kit , 2016, CHI Extended Abstracts.
[40] Nicolas Roussel,et al. 1 € filter: a simple speed-based low-pass filter for noisy input in interactive systems , 2012, CHI.
[41] Andreas Bulling,et al. Prediction of gaze estimation error for error-aware gaze-based interfaces , 2016, ETRA.
[42] Andrew T. Duchowski,et al. Gaze-Contingent Displays: A Review , 2004, Cyberpsychology Behav. Soc. Netw..
[43] Miguel A. Nacenta,et al. Depth perception with gaze-contingent depth of field , 2014, CHI.