Contribution of color in saliency model for videos

Much research has investigated how the low-level features of a visual scene contribute to the deployment of visual attention, and bottom-up saliency models have been developed to predict gaze locations from these features. Color, alongside intensity, contrast, and motion, is considered one of the primary features for computing bottom-up saliency; however, its contribution to guiding eye movements when viewing natural scenes has been debated. We investigated the contribution of color information in a bottom-up visual saliency model. The model's efficiency was tested against experimental data from 45 observers who were eye-tracked while freely exploring a large dataset of color and grayscale videos. The two sets of recorded eye positions, for grayscale and color videos, were compared with the predictions of a luminance-based saliency model (Marat et al. Int J Comput Vis 82:231–243, 2009), into which we incorporated chrominance information. Results show that color information improves the performance of the saliency model in predicting eye positions.
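The pipeline the abstract describes — a luminance saliency pathway augmented with chrominance channels, evaluated against recorded eye positions — can be sketched as follows. This is a minimal illustration, not Marat et al.'s actual model (which uses retina-inspired filtering and a motion pathway): the opponent axes, difference-of-Gaussians center-surround, fusion weights, and all function names here are hypothetical simplifications, and the eye-position comparison uses a standard Normalized Scanpath Saliency (NSS) score.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rgb_to_opponent(rgb):
    """Split an RGB image (H, W, 3), floats in [0, 1], into one achromatic
    and two chromatic channels. These simple opponent axes are illustrative
    stand-ins for a proper luminance/chrominance decomposition."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    lum = (r + g + b) / 3.0        # achromatic (luminance) channel
    rg = r - g                     # red-green chrominance
    by = b - (r + g) / 2.0         # blue-yellow chrominance
    return lum, rg, by

def center_surround(channel, sigma_c=2.0, sigma_s=8.0):
    """Center-surround conspicuity via a difference of Gaussians
    (sigma values are arbitrary illustrative choices)."""
    return np.abs(gaussian_filter(channel, sigma_c)
                  - gaussian_filter(channel, sigma_s))

def saliency(rgb, w_lum=1.0, w_chroma=1.0):
    """Fuse luminance and chrominance conspicuity maps into one saliency
    map, normalized to [0, 1]. The fusion weights are hypothetical, not
    the paper's."""
    lum, rg, by = rgb_to_opponent(rgb)
    s = (w_lum * center_surround(lum)
         + w_chroma * (center_surround(rg) + center_surround(by)) / 2.0)
    return (s - s.min()) / (s.max() - s.min() + 1e-12)

def nss(sal, fixations):
    """Normalized Scanpath Saliency: mean of the z-scored saliency map
    sampled at fixated (row, col) pixel coordinates. Higher means the
    map better predicts where observers looked."""
    z = (sal - sal.mean()) / (sal.std() + 1e-12)
    return float(np.mean([z[r, c] for r, c in fixations]))
```

Under this scheme, the paper's comparison amounts to computing NSS for the luminance-only map (`w_chroma=0`) and the full map on the same fixations, frame by frame, and testing whether adding the chrominance pathway raises the score.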

[1]  Roland J. Baddeley,et al.  High frequency edges (but not contrast) predict where we fixate: A Bayesian system identification analysis , 2006, Vision Research.

[2]  T. Foulsham,et al.  Comparing scanpaths during scene encoding and recognition : A multi-dimensional approach , 2012 .

[3]  O. Le Meur,et al.  Predicting visual fixations on video based on low-level visual features , 2007, Vision Research.

[4]  David Alleysson,et al.  Neurogeometry of color vision , 2012, Journal of Physiology-Paris.

[5]  Thomas Martinetz,et al.  Variability of eye movements when viewing dynamic natural scenes. , 2010, Journal of vision.

[6]  Eric Bruno,et al.  Robust motion estimation using spatial Gabor-like filters , 2002, Signal Process..

[7]  Nathalie Guyader,et al.  Improving Visual Saliency by Adding ‘Face Feature Map’ and ‘Center Bias’ , 2012, Cognitive Computation.

[8]  Douglas DeCarlo,et al.  Robust clustering of eye movement recordings for quantification of visual interest , 2004, ETRA.

[9]  Nathalie Guyader,et al.  Modelling Spatio-Temporal Saliency to Predict Gaze Direction for Short Videos , 2009, International Journal of Computer Vision.

[10]  Anis Rahman,et al.  Face perception in videos: Contributions to a visual saliency model and its implementation on GPUs , 2013 .

[11]  Nathalie Guyader,et al.  Parallel implementation of a spatio-temporal visual saliency model , 2010, Journal of Real-Time Image Processing.

[12]  S. Frintrop  VOCUS: A Visual Attention System for Object Detection and Goal-directed Search , 2010 .

[13]  A. Rahman,et al.  Influence of number, location and size of faces on gaze in video , 2014 .

[14]  Antoine Coutrot,et al.  Influence of soundtrack on eye movements during video exploration , 2012 .

[15]  Masahiro Takei,et al.  Human resource development and visualization , 2009, J. Vis..

[16]  D. W. Heeley,et al.  Cardinal directions of color space , 1982, Vision Research.

[17]  L. Itti , 1999 .

[18]  Thierry Baccino,et al.  New insights into ambient and focal visual fixations using an automatic classification algorithm , 2011, i-Perception.

[19]  G. Rousselet,et al.  Is it an animal? Is it a human face? Fast processing in upright and inverted natural scenes. , 2003, Journal of vision.

[20]  Nathalie Guyader,et al.  When viewing natural scenes, do abnormal colors impact on spatial or temporal parameters of eye movements? , 2012, Journal of vision.

[21]  Paolo Cignoni,et al.  Machine Vision and Applications , 2022 .

[22]  Patrick Le Callet  Critères objectifs avec référence de qualité visuelle des images couleur [Objective full-reference criteria for the visual quality of color images] , 2001 .

[24]  Peter König,et al.  What's color got to do with it? The influence of color on visual attention in different categories. , 2008, Journal of vision.

[25]  Nao Ninomiya,et al.  The 10th anniversary of journal of visualization , 2007, J. Vis..

[26]  John M. Henderson,et al.  Clustering of Gaze During Dynamic Scene Viewing is Predicted by Motion , 2011, Cognitive Computation.

[27]  K. Gegenfurtner,et al.  Cortical mechanisms of colour vision , 2003, Nature Reviews Neuroscience.

[28]  L. Itti,et al.  A Model of Saliency-Based Visual Attention for Rapid Scene Analysis , 1998, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[29]  A. Treisman,et al.  A feature-integration theory of attention , 1980, Cognitive Psychology.

[30]  K. Mullen,et al.  Orientation selectivity in luminance and color vision assessed using 2-d band-pass filtered spatial noise , 2005, Vision Research.

[31]  S. Yantis,et al.  Visual Attention: Bottom-Up Versus Top-Down , 2004, Current Biology.

[32]  Pierre Baldi,et al.  Bayesian surprise attracts human attention , 2005, Vision Research.

[33]  Joseph H. Goldberg,et al.  Identifying fixations and saccades in eye-tracking protocols , 2000, ETRA.

[34]  Andreas Bulling,et al.  Introduction to the PETMEI special issue , 2014 .