Effects of task and image properties on visual-attention deployment in image-quality assessment

Abstract. It is important to understand how humans view images and how their viewing behavior is affected by changes in the properties of the viewed images and by the task they are given, particularly the task of scoring image quality (IQ). This behavior is complex and highly relevant to image-quality research. This work builds on four years of research spanning three databases of image-viewing behavior. Using eye-tracking equipment, it was possible to collect data on human viewing behavior for different kinds of stimuli and under different experimental settings. This work performs a cross-analysis of the results from all these databases using state-of-the-art similarity measures. The results clearly show that asking viewers to score the IQ significantly changes their viewing behavior. Also, muting the color saturation seems to affect the saliency of the images. However, a change in IQ was not consistently found to modify visual-attention deployment, either under free looking or during scoring. These results help provide a better understanding of image-viewing behavior under different conditions. They also have important implications for work that collects subjective image-quality scores from human observers.
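The abstract refers to comparing attention maps across tasks and databases with state-of-the-art similarity measures, without naming the specific measures used. As a minimal illustrative sketch (not the authors' actual analysis code), the snippet below computes one widely used measure, the Pearson linear correlation coefficient (CC), between two fixation-density maps; the map sizes and random data are placeholders for maps derived from real eye-tracking fixations.

```python
import numpy as np

def correlation_coefficient(map_a: np.ndarray, map_b: np.ndarray) -> float:
    """Pearson linear correlation (CC) between two saliency/fixation-density maps.

    CC is a standard similarity measure for comparing attention maps:
    values near 1 indicate nearly identical attention deployment,
    values near 0 indicate unrelated deployment.
    """
    a = (map_a - map_a.mean()) / (map_a.std() + 1e-12)  # z-score map A
    b = (map_b - map_b.mean()) / (map_b.std() + 1e-12)  # z-score map B
    return float(np.mean(a * b))

# Hypothetical usage: compare a free-viewing map with a quality-scoring map.
# Random arrays stand in for smoothed fixation maps from an eye tracker.
rng = np.random.default_rng(0)
free_viewing_map = rng.random((48, 64))
scoring_task_map = rng.random((48, 64))
print(correlation_coefficient(free_viewing_map, scoring_task_map))
```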
