Computational Model for Predicting Visual Fixations from Childhood to Adulthood

How people look at visual information reveals fundamental information about them: their interests and their state of mind. While previous visual attention models output static 2-dimensional saliency maps, saccadic models aim to predict not only where observers look but also how they move their eyes to explore the scene. Here we demonstrate that saccadic models are a flexible framework that can be tailored to emulate observers' viewing tendencies. More specifically, we use the eye data from 101 observers split into five age groups (adults, 8-10 y.o., 6-8 y.o., 4-6 y.o. and 2 y.o.) to train our saccadic model for different stages of the development of the human visual system. We show that the joint distribution of saccade amplitude and orientation is a visual signature specific to each age group, and can be used to generate age-dependent scanpaths. Our age-dependent saccadic model not only outputs human-like, age-specific visual scanpaths, but also significantly outperforms other state-of-the-art saliency models. In this paper, we demonstrate that the computational modelling of visual attention, through the use of saccadic models, can be efficiently adapted to emulate the gaze behavior of a specific group of observers.
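The core idea described above can be sketched in a few lines: estimate the joint distribution of saccade amplitude and orientation from a group's eye data, then generate a scanpath by repeatedly sampling (amplitude, orientation) pairs from it. This is only a minimal illustration of the principle, not the authors' implementation; the function names, bin counts, and the 25-degree amplitude cap are assumptions, and the full model additionally weights candidate fixations by bottom-up saliency.

```python
import numpy as np

rng = np.random.default_rng(0)

def joint_saccade_distribution(amplitudes, orientations,
                               amp_bins=20, ori_bins=36, max_amp=25.0):
    """Estimate the joint distribution of saccade amplitude (deg of
    visual angle) and orientation (radians) as a normalized 2-D histogram."""
    hist, amp_edges, ori_edges = np.histogram2d(
        amplitudes, orientations,
        bins=[amp_bins, ori_bins],
        range=[[0.0, max_amp], [-np.pi, np.pi]])
    hist = hist / hist.sum()
    return hist, amp_edges, ori_edges

def generate_scanpath(joint, amp_edges, ori_edges,
                      start=(0.0, 0.0), n_fixations=10):
    """Sample a scanpath by drawing (amplitude, orientation) pairs from
    the group-specific joint distribution and stepping from the current
    fixation. A full saccadic model would also weight each candidate
    landing position by the saliency map of the scene."""
    flat = joint.ravel()
    x, y = start
    path = [(x, y)]
    for _ in range(n_fixations - 1):
        idx = rng.choice(flat.size, p=flat)
        i, j = np.unravel_index(idx, joint.shape)
        # Sample uniformly within the chosen histogram bin.
        amp = rng.uniform(amp_edges[i], amp_edges[i + 1])
        ori = rng.uniform(ori_edges[j], ori_edges[j + 1])
        x, y = x + amp * np.cos(ori), y + amp * np.sin(ori)
        path.append((x, y))
    return path
```

Fitting one such distribution per age group is what makes the generated scanpaths age-dependent: a group whose saccades cluster along the horizontal axis, for instance, yields predominantly horizontal simulated eye movements.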
