Digit-tracking as a new tactile interface for visual perception analysis

Eye-tracking is a valuable tool in cognitive science for measuring how visual processing resources are allocated during scene exploration. However, eye-tracking technology is largely confined to laboratory settings, making it difficult to apply in large-scale studies. Here we introduce a biologically inspired alternative: a Gaussian-blurred image is presented on a touch-sensitive display and locally unblurred wherever the user slides a finger. The finger's trajectory thus serves as a proxy for eye movements and attention. We validated the method by showing strong correlations between attention maps obtained with finger-tracking and with conventional optical eye-tracking. Using neural networks trained to predict empirically derived attention maps, we established that the same high-level features hierarchically drive exploration under either method. Finally, the diagnostic value of digit-tracking was tested in autistic and brain-damaged patients. The rapid yet robust measures afforded by this method open the way to large-scale applications in research and clinical settings.
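The core display manipulation — a blurred image that is sharpened locally around the finger position — can be sketched as follows. This is a minimal illustration in Python with NumPy/SciPy, not the authors' implementation; the function name, blur sigma, and aperture radius are all assumptions chosen for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def make_display(image, finger_xy, blur_sigma=8.0, aperture_radius=40):
    """Blend a Gaussian-blurred copy of a grayscale `image` with the sharp
    original inside a soft circular aperture centred on the finger position.

    Note: parameter values and the soft-edge profile are illustrative
    assumptions, not the published digit-tracking settings.
    """
    image = image.astype(float)
    blurred = gaussian_filter(image, sigma=blur_sigma)

    # Distance of every pixel from the current finger position.
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = finger_xy
    dist = np.hypot(xx - x, yy - y)

    # Soft-edged aperture: 1 (fully sharp) inside the radius,
    # fading linearly to 0 (fully blurred) over one radius-width.
    mask = np.clip(1.0 - (dist - aperture_radius) / aperture_radius, 0.0, 1.0)

    return mask * image + (1.0 - mask) * blurred
```

In a real interface this compositing would be re-run on every touch event, and the sequence of `finger_xy` samples would be accumulated (e.g. as a smoothed 2D histogram) to build the attention map that is then compared against eye-tracking data.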
