Flipping the stimulus: Effects on scanpath coherence?

In experiments investigating dynamic tasks, it is often useful to examine eye movement scan patterns. Presenting the same trial repeatedly allows within-subject and within-condition similarity to be computed, which distinguishes signal from noise in gaze data. To keep the repetitions from being obvious, filler trials must be added to the experimental protocol, resulting in long experiments. Alternatively, trials can be modified so that participants are less likely to notice the repetition, provided the modification does not substantially change the scan patterns. In tasks in which the stimuli can be geometrically transformed without loss of meaning, flipping the stimuli around either axis is a candidate modification. In this study, we examined whether flipping stimulus object trajectories around the x- and y-axes yields comparable scan patterns in a multiple object tracking task. We developed two new strategies for the statistical comparison of similarity between two groups of scan patterns and tested them on artificial data. Our results suggest that although the scan patterns in flipped trials differ significantly from those in the original trials, the difference is small (as little as a 13% increase in overall distance). Researchers could therefore use geometric transformations to test more complex hypotheses regarding scan pattern coherence without lengthening their experiments.
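The two ingredients of this design, geometric flipping of trajectories and a group-level comparison of scan pattern similarity, can be illustrated in a few lines. The sketch below is not the authors' implementation: the function names, the mean pointwise Euclidean distance, and the label-shuffling permutation test are all illustrative assumptions (a permutation approach to comparing groups of scanpaths is one common choice in this literature). It assumes scan patterns are resampled to a common length and expressed as (x, y) arrays centered on the screen.

```python
import numpy as np

def flip(trajectory, axis):
    """Mirror an (n, 2) array of x, y samples around the x- or y-axis."""
    flipped = np.asarray(trajectory, dtype=float).copy()
    if axis == "x":       # flip around the x-axis: negate the y coordinates
        flipped[:, 1] *= -1
    elif axis == "y":     # flip around the y-axis: negate the x coordinates
        flipped[:, 0] *= -1
    else:
        raise ValueError("axis must be 'x' or 'y'")
    return flipped

def scanpath_distance(a, b):
    """Mean pointwise Euclidean distance between equally sampled paths
    (an assumed stand-in for a proper scanpath similarity measure)."""
    return float(np.mean(np.linalg.norm(np.asarray(a) - np.asarray(b), axis=1)))

def group_statistic(group_a, group_b):
    """Mean between-group distance minus mean within-group distance."""
    between = [scanpath_distance(a, b) for a in group_a for b in group_b]
    within = [scanpath_distance(x, y)
              for g in (group_a, group_b)
              for i, x in enumerate(g)
              for y in g[i + 1:]]
    return np.mean(between) - np.mean(within)

def permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """p-value for the null that the two group labels are exchangeable."""
    rng = np.random.default_rng(seed)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    observed = group_statistic(group_a, group_b)
    exceed = sum(
        group_statistic(shuffled[:n_a], shuffled[n_a:]) >= observed
        for shuffled in ([pooled[i] for i in rng.permutation(len(pooled))]
                         for _ in range(n_perm))
    )
    return (exceed + 1) / (n_perm + 1)

# Toy usage: scan patterns from "original" trials vs. their y-axis flips,
# each resampled to 50 gaze samples with coordinates centered on (0, 0).
rng = np.random.default_rng(1)
original = [rng.normal(size=(50, 2)) for _ in range(8)]
flipped = [flip(t, "y") for t in original]
print(permutation_test(original, flipped))
```

If flipped trials evoke scan patterns comparable to the originals (once the flip is undone by mirroring the gaze data back), the between-group distances stay close to the within-group distances and the test does not reject; a large between-group excess would indicate that the transformation itself changed viewing behavior.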
