Searchers adjust their eye-movement dynamics to target characteristics in natural scenes
[1] Jiri Najemnik, et al. Eye movement statistics in humans are consistent with an optimal search strategy. Journal of Vision, 2008.
[2] Michael L. Mack, et al. Visual saliency does not account for eye movements during visual search in real-world scenes. 2007.
[3] Dennis M. Levi, et al. What limits performance in the amblyopic visual system: seeing signals in noise with an amblyopic brain. Journal of Vision, 2008.
[4] J. Henderson, et al. Facilitation of return during scene viewing. 2009.
[5] J. Antes. The time course of picture viewing. Journal of Experimental Psychology, 1974.
[6] C. Cierpka, et al. Particle imaging techniques for volumetric three-component (3D3C) velocity measurements in microfluidics. Journal of Visualization, 2011.
[7] N. Mackworth, et al. Cognitive determinants of fixation location during picture viewing. Journal of Experimental Psychology: Human Perception and Performance, 1978.
[8] Kerry Hourigan, et al. Wake transition of a rolling sphere. Journal of Visualization, 2011.
[9] R. Hess, et al. On the decline of 1st and 2nd order sensitivity with eccentricity. Journal of Vision, 2008.
[10] Felix A. Wichmann, et al. Influence of initial fixation position in scene viewing. Vision Research, 2016.
[11] Zhaoyu Wei, et al. The plunging cavities formed by the impinged jet after the entry of a sphere into water. Journal of Visualization, 2014.
[12] Masahiro Takei, et al. Human resource development and visualization. Journal of Visualization, 2009.
[13] George L. Malcolm, et al. The effects of target template specificity on visual search in real-world scenes: evidence from eye movements. Journal of Vision, 2009.
[14] J. Rovamo, et al. An estimation and application of the human cortical magnification factor. Experimental Brain Research, 2004.
[15] Peter König, et al. Saccadic momentum and facilitation of return saccades contribute to an optimal foraging strategy. PLoS Computational Biology, 2013.
[16] W. Geisler, et al. Separation of low-level and high-level factors in complex tasks: visual search. Psychological Review, 1995.
[17] J. Henderson, et al. The effects of semantic consistency on eye movements during complex scene viewing. 1999.
[18] Gregory J. Zelinsky, et al. Scene context guides eye movements during visual search. Vision Research, 2006.
[19] Matthias Bethge, et al. DeepGaze II: Reading fixations from deep features trained on object recognition. arXiv, 2016.
[20] Jonathan Baron. Nonconsequentialist decisions. Commentaries. Author's reply. 1994.
[21] Benjamin W. Tatler, et al. Systematic tendencies in scene viewing. 2008.
[22] Simon Barthelmé, et al. Spatial statistics and attentional dynamics in scene viewing. Journal of Vision, 2014.
[23] Alex D. Hwang, et al. A model of top-down attentional control during visual search in complex scenes. Journal of Vision, 2009.
[24] B. Tatler, et al. The prominence of behavioural biases in eye guidance. 2009.
[25] P. Subramanian. Active Vision: The Psychology of Looking and Seeing. 2006.
[26] Ralf Engbert, et al. Spatial frequency processing in the central and peripheral visual field during scene viewing. Vision Research, 2015.
[27] J. Wolfe, et al. Guided Search 2.0: A revised model of visual search. Psychonomic Bulletin & Review, 1994.
[28] J. H. van Hateren, et al. Simulating human cones from mid-mesopic up to high-photopic luminances. Journal of Vision, 2007.
[29] Mark Wexler, et al. The nonlinear structure of motion perception during smooth eye movements. Journal of Vision, 2009.
[30] Johan Hulleman, et al. The impending demise of the item in visual search. Behavioral and Brain Sciences, 2015.
[31] T. Meese, et al. The attenuation surface for contrast sensitivity has the form of a witch's hat within the central visual field. Journal of Vision, 2012.
[32] A. Nuthmann. How do the regions of the visual field contribute to object search in real-world scenes? Evidence from eye movements. Journal of Experimental Psychology: Human Perception and Performance, 2014.
[33] Matthias Bethge, et al. Information-theoretic model comparison unifies saliency metrics. Proceedings of the National Academy of Sciences, 2015.
[34] Ralf Engbert, et al. ICAT: a computational model for the adaptive control of fixation durations. Psychonomic Bulletin & Review, 2014.
[35] Antonio Torralba, et al. Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search. Psychological Review, 2006.
[36] Barton L. Anderson, et al. Texture-shading flow interactions and perceived reflectance. Journal of Vision, 2014.
[37] J. Duncan, et al. Visual search and stimulus similarity. Psychological Review, 1989.
[38] Heiko H. Schütt, et al. An image-computable psychophysical spatial vision model. Journal of Vision, 2017.
[39] Felix A. Wichmann, et al. Disentangling top-down vs. bottom-up and low-level vs. high-level influences on eye movements over time. arXiv:1803.07352, 2018.
[40] James R. Brockmole, et al. LATEST: A model of saccadic decisions in space and time. Psychological Review, 2017.
[41] Ralf Engbert, et al. Control of fixation duration during scene viewing by interaction of foveal and peripheral processing. Journal of Vision, 2013.
[42] Ralf Engbert, et al. Temporal evolution of the central fixation bias in scene viewing. Journal of Vision, 2016.
[43] A. Mizuno, et al. A change of the leading player in flow visualization technique. Journal of Visualization, 2006.
[44] J. Robson, et al. Application of Fourier analysis to the visibility of gratings. The Journal of Physiology, 1968.
[45] C. Meinecke, et al. Retinal eccentricity and the detection of targets. Psychological Research, 1989.
[46] Bernhard Schölkopf, et al. Center-surround patterns emerge as optimal predictors for human saccade targets. Journal of Vision, 2009.
[47] Graham L. Pierce, et al. Eye movements during scene viewing: Evidence for mixed control of fixation durations. Psychonomic Bulletin & Review, 2008.
[48] Ralf Engbert, et al. Microsaccades are triggered by low retinal image slip. Proceedings of the National Academy of Sciences of the United States of America, 2006.
[49] U. Neisser. Visual search. Scientific American, 1964.
[50] Antonio Torralba. Modeling global scene factors in attention. Journal of the Optical Society of America A, Optics, Image Science, and Vision, 2003.
[51] R. Shapley, et al. Noise masking of White's illusion exposes the weakness of current spatial filtering models of lightness perception. Journal of Vision, 2015.
[52] Benjamin W. Tatler. The central fixation bias in scene viewing: selecting an optimal viewing position independently of motor biases and image feature distributions. Journal of Vision, 2007.
[53] Tim H. W. Cornelissen, et al. Stuck on semantics: Processing of irrelevant object-scene inconsistencies modulates ongoing gaze behavior. Attention, Perception & Psychophysics, 2017.
[54] C. Erkelens, et al. Coarse-to-fine eye movement strategy in visual search. Vision Research, 2007.
[55] George L. Malcolm, et al. How context information and target information guide the eyes from the first epoch of search in real-world scenes. Journal of Vision, 2014.
[56] Ralf Engbert, et al. Disentangling bottom-up versus top-down and low-level versus high-level influences on eye movements over time. Journal of Vision, 2019.
[57] J. Robson, et al. Probability summation and regional variation in contrast sensitivity across the visual field. Vision Research, 1981.
[58] A. Treisman, et al. A feature-integration theory of attention. Cognitive Psychology, 1980.
[59] Matthias Bethge, et al. DeepGaze II: Predicting fixations from deep features over time and tasks. 2017.
[60] Rumi Tokunaga, et al. The modern Japanese color lexicon. Journal of Vision, 2017.
[61] R Development Core Team. R: A language and environment for statistical computing. 2010.
[62] Jyrki Rovamo, et al. Contrast sensitivity as a function of spatial frequency, viewing distance and eccentricity with and without spatial noise. Vision Research, 1992.
[63] Heiko H. Schütt, et al. Likelihood-Based Parameter Estimation and Comparison of Dynamical Cognitive Models. Psychological Review, 2016.
[64] Frédo Durand, et al. Learning to predict where humans look. IEEE 12th International Conference on Computer Vision (ICCV), 2009.
[65] K. Fujii, et al. Visualization for the analysis of fluid motion. Journal of Visualization, 2005.
[66] Marcus Nyström, et al. An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data. Behavior Research Methods, 2010.
[67] Ralf Engbert, et al. Microsaccades uncover the orientation of covert attention. Vision Research, 2003.
[68] Zhi Liu, et al. Saccadic model of eye movements for free-viewing condition. Vision Research, 2015.
[69] Wilson S. Geisler, et al. Optimal eye movement strategies in visual search. Nature, 2005.
[70] Matthias Bethge, et al. Deep Gaze I: Boosting Saliency Prediction with Feature Maps Trained on ImageNet. ICLR, 2014.
[71] Adrian Baddeley, et al. spatstat: An R Package for Analyzing Spatial Point Patterns. 2005.
[72] Paul M. Bays, et al. Active inhibition and memory promote exploration and search of natural scenes. Journal of Vision, 2012.
[73] R. F. Hess, et al. The contrast sensitivity gradient across the human visual field: With emphasis on the low spatial frequency range. Vision Research, 1989.
[74] Katsumi Aoki, et al. Recent development of flow visualization. Journal of Visualization, 2004.