Predicting the Valence of a Scene from Observers’ Eye Movements

Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks, such as video genre classification and content-based image retrieval. Recently, there has been increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene, such as its valence. To determine the emotional category of images from eye movements, existing methods typically learn a classifier over several features extracted from the eye-movement record. Although eye movements have been shown to be potentially useful for recognizing scene valence, the contribution of each individual feature is not well studied. To address this issue, we study the contribution of features extracted from eye movements to the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion, including the histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We take a machine learning approach, analyzing the performance of the features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the features that contribute most. These features signify the influence of fixation information and of the angular behavior of eye movements on the recognition of image valence.
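As a rough illustration of the kind of pipeline described above, the following minimal sketch shows how eye-movement histogram features could be fused by simple concatenation (early fusion) and fed to a support vector machine for three-way valence classification. It is not the authors' exact implementation: the preprocessing, bin choices, synthetic data, and the use of scikit-learn are assumptions made for the example.

```python
# Minimal sketch (not the authors' exact pipeline): early fusion of
# eye-movement histogram features and SVM classification of valence.
# Assumes each trial already provides saccade lengths/orientations and
# fixation durations as NumPy arrays (hypothetical preprocessing step).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def histogram_feature(values, bins, value_range):
    """Normalized histogram of one eye-movement statistic (e.g. saccade length)."""
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)  # normalize so trials are comparable

def fuse_features(saccade_len, saccade_ori, fixation_dur):
    """Early fusion: concatenate the per-feature histograms into one vector."""
    return np.concatenate([
        histogram_feature(saccade_len, bins=8, value_range=(0, 500)),      # pixels
        histogram_feature(saccade_ori, bins=8, value_range=(-np.pi, np.pi)),
        histogram_feature(fixation_dur, bins=8, value_range=(0, 1000)),    # ms
    ])

# Synthetic trials: eye-movement arrays plus a valence label
# (0 = unpleasant, 1 = neutral, 2 = pleasant) -- illustration only.
rng = np.random.default_rng(0)
trials = [{"sl": rng.gamma(2, 60, 30), "so": rng.uniform(-np.pi, np.pi, 30),
           "fd": rng.gamma(2, 120, 30), "label": rng.integers(0, 3)}
          for _ in range(90)]

X = np.array([fuse_features(t["sl"], t["so"], t["fd"]) for t in trials])
y = np.array([t["label"] for t in trials])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Late fusion (training one SVM per feature and combining their decisions) would follow the same structure, with a separate classifier per histogram instead of a single concatenated vector.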
