Serious gaze
[1] Marc Levoy,et al. Gaze-directed volume rendering , 1990, I3D '90.
[2] Norman I. Badler,et al. Eye Movements, Saccades, and Multiparty Conversations , 2008 .
[3] Robert J. K. Jacob,et al. What you look at is what you get: eye movement-based interaction techniques , 1990, CHI '90.
[4] Shumin Zhai,et al. Manual and gaze input cascaded (MAGIC) pointing , 1999, CHI '99.
[5] Anthony M. Norcia,et al. Directional selectivity in the cortex , 1998 .
[6] R. Leigh,et al. Triggering mechanisms in microsaccade and saccade generation: a novel proposal , 2011, Annals of the New York Academy of Sciences.
[7] Dinesh K. Pai,et al. Eyecatch: simulating visuomotor coordination for object interception , 2012, ACM Trans. Graph..
[8] Veronica Sundstedt,et al. Gazing at games: using eye tracking to control virtual characters , 2010, SIGGRAPH '10.
[9] Donald H. House,et al. Online 3D Gaze Localization on Stereoscopic Displays , 2014, TAP.
[10] Donald H. House,et al. Measuring vergence over stereoscopic video with a remote eye tracker , 2010, ETRA.
[11] Krzysztof Krejtz,et al. Visualizing Dynamic Ambient/Focal Attention with Coefficient K , 2015, ETVIS.
[12] Dave Roberts,et al. Eye gaze in virtual environments: evaluating the need and initial work on implementation , 2009, Concurr. Comput. Pract. Exp..
[13] Mel Slater,et al. The impact of avatar realism and eye gaze control on perceived quality of communication in a shared immersive virtual environment , 2003, CHI '03.
[14] Desney S. Tan,et al. Foveated 3D graphics , 2012, ACM Trans. Graph..
[15] Roel Vertegaal,et al. The GAZE groupware system: mediating joint attention in multiparty communication and collaboration , 1999, CHI '99.
[16] D. Noton,et al. Eye movements and visual perception , 1971, Scientific American.
[17] Andrew T. Duchowski. Hardware-accelerated real-time simulation of arbitrary visual fields , 2004, ETRA.
[18] Manuel Menezes de Oliveira Neto,et al. Photorealistic models for pupil light reflex and iridal pattern deformation , 2009, ACM Trans. Graph..
[19] Pia Rotshtein,et al. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze , 2016, PloS one.
[20] M. A. Just,et al. A theory of reading: from eye fixations to comprehension , 1980, Psychological Review.
[21] Pernilla Qvarfordt,et al. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications , 2016, ETRA.
[22] Hans-Peter Seidel,et al. GazeStereo3D: seamless disparity manipulations , 2016, ACM Trans. Graph..
[23] Benjamin Watson,et al. Perceptually Driven Simplification Using Gaze-Directed Rendering , 2012 .
[24] Brent Lance,et al. A model of gaze for the purpose of emotional expression in virtual embodied agents , 2008, AAMAS.
[25] Soraia Raupp Musse,et al. Providing expressive gaze to virtual animated characters in interactive applications , 2008, CIE.
[26] Ali Borji,et al. State-of-the-Art in Visual Attention Modeling , 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[27] L. Stark,et al. The main sequence, a tool for studying human eye movements , 1975 .
[28] C. Mead,et al. Neuromorphic Robot Vision with Mixed Analog-Digital Architecture , 2005 .
[29] Lawrence Stark. Pupil Unrest: An Example of Noise in a Biological Servomechanism , 1958, Nature.
[30] M. Just,et al. Eye fixations and cognitive processes , 1976, Cognitive Psychology.
[31] Stephen D. Landy. Mapping the Universe. , 1999 .
[32] Dmitry Sokolov,et al. On Smooth 3D Frame Field Design , 2015, arXiv.
[33] Deborah J. Aks,et al. Memory Across Eye-Movements: 1/f Dynamic in Visual Search , 2010 .
[34] Ralf Engbert,et al. Computational Modeling of Collicular Integration of Perceptual Responses and Attention in Microsaccades , 2012, The Journal of Neuroscience.
[35] R. Baloh,et al. Quantitative measurement of saccade amplitude, duration, and velocity , 1975, Neurology.
[36] Li-Yi Wei,et al. Point sampling with general noise spectrum , 2012, ACM Trans. Graph..
[37] J. Harte,et al. Self-similarity and clustering in the spatial distribution of species , 2000, Science.
[38] T. C. Nicholas Graham,et al. Use of eye movements for video game control , 2006, ACE '06.
[39] Vsevolod Peysakhovich. Study of pupil diameter and eye movements to enhance flight safety , 2016 .
[40] Anand K. Gramopadhye,et al. Gaze-augmented think-aloud as an aid to learning , 2012, CHI.
[41] H. Collewijn,et al. Binocular co-ordination of human horizontal saccadic eye movements , 1988, The Journal of Physiology.
[42] Heloir,et al. The Uncanny Valley , 2019, The Animation Studies Reader.
[43] Cai Hong-bin. Real-time depth-of-field simulation on GPU , 2008 .
[44] David M. Hoffman,et al. The zone of comfort: Predicting visual discomfort with stereo displays , 2011, Journal of Vision.
[45] Norman I. Badler,et al. Look me in the Eyes: A Survey of Eye and Gaze Animation for Virtual Agents and Artificial Systems , 2014, Eurographics.
[46] Lester C. Loschky,et al. How late can you update gaze-contingent multiresolutional displays without detection? , 2007, TOMCCAP.
[47] Mark Mon-Williams,et al. Natural problems for stereoscopic depth perception in virtual environments , 1995, Vision Research.
[48] Hans-Peter Seidel,et al. Saccade landing position prediction for gaze-contingent rendering , 2017, ACM Trans. Graph..
[49] Andrew T. Duchowski,et al. Gaze Transition Entropy , 2015, TAP.
[50] Donald H. House,et al. Reducing visual discomfort of 3D stereoscopic displays with gaze-contingent depth-of-field , 2014, SAP.
[51] Heng Zhou,et al. Perceptual evaluation of synthetic gaze jitter , 2018, Comput. Animat. Virtual Worlds.
[52] Joohwan Kim,et al. Towards foveated rendering for gaze-tracked virtual reality , 2016, ACM Trans. Graph..
[53] Thomas Martinetz,et al. Gaze-contingent temporal filtering of video , 2006, ETRA '06.
[54] Richard A. Bolt,et al. A gaze-responsive self-disclosing display , 1990, CHI '90.
[55] John R. Wilson,et al. Effects of participating in virtual environments: a review of current knowledge , 1996 .
[56] J. Anliker,et al. Eye movements - On-line measurement, analysis, and control , 1976 .
[57] Anand K. Gramopadhye,et al. 3D eye movement analysis for VR visual inspection training , 2002, ETRA.
[58] Norman I. Badler,et al. Evaluating perceived trust from procedurally animated gaze , 2013, MIG.
[59] Norman I. Badler,et al. Eyes alive , 2002, ACM Trans. Graph..
[60] Hans-Peter Seidel,et al. Modeling and optimizing eye vergence response to stereoscopic cuts , 2014, ACM Trans. Graph..
[61] Derek Bradley,et al. High-quality capture of eyes , 2014, ACM Trans. Graph..
[62] Koji Kashihara,et al. Emotional attention modulates microsaccadic rate and direction , 2014, Psychological research.
[63] P. Szendrő,et al. Pink-noise behaviour of biosystems , 2001, European Biophysics Journal.
[64] Iain Matthews,et al. Modeling and animating eye blinks , 2011, TAP.
[65] Andrew T. Duchowski,et al. A rotary dial for gaze-based PIN entry , 2016, ETRA.
[66] Usher,et al. Dynamic pattern formation leads to 1/f noise in neural populations , 1995, Physical Review Letters.
[67] S. K. Rushton. Developing visual systems and exposure to virtual reality and stereo displays: some concerns and speculations about the demands on accommodation and vergence , 1999, Applied Ergonomics.
[68] B. Tatler,et al. Yarbus, eye movements, and vision , 2010, i-Perception.
[69] Ian P. Howard. Seeing in Depth , 2008 .
[70] Andrew T. Duchowski,et al. Perceptual gaze extent & level of detail in VR: looking outside the box , 2002, SIGGRAPH '02.
[71] Radoslaw Mantiuk,et al. Gaze-Dependent Depth-of-Field Effect Rendering in Virtual Environments , 2011, SGDA.
[72] Douglas Lanman,et al. Focal surface displays , 2017, ACM Trans. Graph..
[73] Qi Zhao,et al. Noise Characterization, Modeling, and Reduction for In Vivo Neural Recording , 2009, NIPS.
[74] R. Pritchard. Stabilized images on the retina , 1961, Scientific American.
[75] Lester C. Loschky,et al. User performance with gaze contingent multiresolutional displays , 2000, ETRA.
[76] A. J. Van Opstal,et al. Skewness of saccadic velocity profiles: A unifying parameter for normal and slow saccades , 1987, Vision Research.
[77] Tsuneto Iwasaki,et al. The tolerance range of binocular disparity on a 3D display based on the physiological characteristics of ocular accommodation , 2009, Displays.
[78] Arzu Çöltekin,et al. Foveated gaze-contingent displays for peripheral LOD management, 3D visualization, and stereo imaging , 2007, TOMCCAP.
[79] Andrew T. Duchowski,et al. Eye movement synthesis , 2016, ETRA.