The Effect of Task on Visual Attention in Interactive Virtual Environments
Eamonn O'Neill | Michael J. Proulx | Jacob Hadnett-Hunter | George Nicolaou