A Design Space for Gaze Interaction on Head-mounted Displays
Teresa Hirzle | Jan Gugenheimer | Florian Geiselhart | Andreas Bulling | Enrico Rukzio
[1] Kang Ryoung Park, et al. 3D gaze tracking method using Purkinje images on eye optical model and pupil, 2012.
[2] M. T. M. Lambooij, et al. Visual Discomfort and Visual Fatigue of Stereoscopic Displays: A Review, 2009.
[3] Gordon Wetzstein, et al. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays, 2017, Proceedings of the National Academy of Sciences.
[4] Allen G. Taylor, et al. What Is the Microsoft HoloLens, 2016.
[5] Sheng-Wen Shih, et al. A novel approach to 3-D gaze tracking using stereo cameras, 2004, IEEE Trans. Syst. Man Cybern. Part B.
[6] Nicolas Roussel, et al. 1 € filter: a simple speed-based low-pass filter for noisy input in interactive systems, 2012, CHI.
[7] Kent Lyons, et al. Looking at or through?: using eye tracking to infer attention location for wearable transparent displays, 2014, ISWC.
[8] Hans-Werner Gellersen, et al. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets, 2013, UbiComp.
[9] Philipp Slusallek, et al. Predicting the gaze depth in head-mounted displays using multiple feature regression, 2018, ETRA.
[10] Michael Rohs, et al. The smart phone: a ubiquitous input device, 2006, IEEE Pervasive Computing.
[11] P. Milgram, et al. A Taxonomy of Mixed Reality Visual Displays, 1994.
[12] Henry Been-Lirn Duh, et al. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR, 2008, 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.
[13] Pierre-Yves Laffont, et al. Verifocal: a platform for vision correction and accommodation in head-mounted displays, 2018, SIGGRAPH Emerging Technologies.
[14] Enrico Rukzio, et al. EyeVR: low-cost VR eye-based interaction, 2016, UbiComp Adjunct.
[15] F. Zwicky, et al. The Morphological Approach to Discovery, Invention, Research and Construction, 1967.
[16] Gordon Wetzstein, et al. Focus 3D: Compressive accommodation display, 2013, ACM Trans. Graph.
[17] Jörg Müller, et al. GazeHorizon: enabling passers-by to interact with public displays by gaze, 2014, UbiComp.
[18] Lucas Paletta, et al. Smartphone eye tracking toolbox: accurate gaze recovery on mobile displays, 2014.
[19] Stephan Reichelt, et al. Depth cues in human visual perception and their realization in 3D displays, 2010, Defense + Commercial Sensing.
[20] Thies Pfeiffer. Measuring and visualizing attention in space with 3D attention volumes, 2012, ETRA '12.
[21] Lucas Paletta, et al. 3D recovery of human gaze in natural environments, 2013, Electronic Imaging.
[22] Donald H. House, et al. Comparing estimated gaze depth in virtual and physical environments, 2014, ETRA.
[23] Florian Alt, et al. GazeTouchPass: Multimodal Authentication Using Gaze and Touch on Mobile Devices, 2016, CHI Extended Abstracts.
[24] David M. Hoffman, et al. The zone of comfort: Predicting visual discomfort with stereo displays, 2011, Journal of Vision.
[25] Paul Milgram, et al. Perceptual issues in augmented reality, 1996, Electronic Imaging.
[26] Hans-Werner Gellersen, et al. Gaze + pinch interaction in virtual reality, 2017, SUI.
[27] Pushkar Shukla, et al. 3D gaze estimation in the scene volume with a head-mounted eye tracker, 2018, COGAIN@ETRA.
[28] Jeff B. Pelz, et al. 3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker, 2008, ETRA '08.
[29] James D. Foley, et al. The human factors of computer graphics interaction techniques, 1984, IEEE Computer Graphics and Applications.
[30] Thies Pfeiffer, et al. EyeSee3D: a low-cost approach for analyzing mobile 3D eye tracking data using computer vision and augmented reality technology, 2014, ETRA.
[31] Mark Billinghurst, et al. Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality, 2018, CHI.
[32] Jock D. Mackinlay, et al. The design space of input devices, 1990, CHI '90.
[33] Joohwan Kim, et al. Towards foveated rendering for gaze-tracked virtual reality, 2016, ACM Trans. Graph.
[34] Andreas Bulling, et al. On the Verge: Voluntary Convergences for Accurate and Precise Timing of Gaze Input, 2016, CHI Extended Abstracts.
[35] Gordon Wetzstein, et al. Accommodation-invariant computational near-eye displays, 2017, ACM Trans. Graph.
[36] F. W. Campbell. A method for measuring the depth of field of the human eye, 1954.
[37] Jörg Müller, et al. Eye tracking for public displays in the wild, 2015, Personal and Ubiquitous Computing.
[38] Rafael Ballagas, et al. The Design Space of 3D Printable Interactivity, 2018, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.
[39] Andreas Bulling, et al. Towards a Symbiotic Human-Machine Depth Sensor: Exploring 3D Gaze for Object Reconstruction, 2018, UIST.
[40] Jock D. Mackinlay, et al. A morphological analysis of the design space of input devices, 1991, TOIS.
[41] F. Toates, et al. Accommodation function of the human eye, 1972, Physiological Reviews.
[42] Ronald Azuma, et al. A Survey of Augmented Reality, 1997, Presence: Teleoperators & Virtual Environments.
[43] Arindam Dey, et al. Estimating Gaze Depth Using Multi-Layer Perceptron, 2017, International Symposium on Ubiquitous Virtual Reality (ISUVR).
[44] Yusuke Sugano, et al. 3D gaze estimation from 2D pupil positions on monocular head-mounted eye trackers, 2016, ETRA.
[45] George Drettakis, et al. Accommodation and Comfort in Head-Mounted Displays, 2018.
[46] Fumio Kishino, et al. Augmented reality: a class of displays on the reality-virtuality continuum, 1995, Proc. SPIE.
[47] Enkelejda Kasneci, et al. 3D Gaze Estimation using Eye Vergence, 2016, HEALTHINF.
[48] Panos Markopoulos, et al. The design space of shape-changing interfaces: a repertory grid study, 2014, Conference on Designing Interactive Systems.
[49] Vangelis Metsis, et al. Low-cost head position tracking for gaze point estimation, 2012, PETRA '12.
[50] Ivan E. Sutherland, et al. A head-mounted three dimensional display, 1968, AFIPS Fall Joint Computing Conference.
[51] Hans-Werner Gellersen, et al. Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements, 2015, UIST.
[52] Donald H. House, et al. Online 3D Gaze Localization on Stereoscopic Displays, 2014, TAP.
[53] John Vince, et al. Introduction to Virtual Reality, 2004, Springer London.
[54] David M. Hoffman, et al. Vergence-accommodation conflicts hinder visual performance and cause visual fatigue, 2008, Journal of Vision.
[56] Andreas Bulling, et al. Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction, 2014, UbiComp Adjunct.
[57] David Zeltzer, et al. Autonomy, Interaction, and Presence, 1992, Presence: Teleoperators & Virtual Environments.
[58] Donald H. House, et al. Reducing visual discomfort of 3D stereoscopic displays with gaze-contingent depth-of-field, 2014, SAP.
[59] Jürgen Beyerer, et al. Real-time 3D gaze analysis in mobile applications, 2013, ETSA '13.