BubbleView: an alternative to eye-tracking for crowdsourcing image importance

We present BubbleView, a methodology that replaces eye-tracking with mouse clicks. Participants are shown a series of blurred images and click to reveal "bubbles": small circular regions of the image at original resolution, analogous to the restricted area of sharp focus provided by the eye's fovea. We evaluated BubbleView on a variety of image types (information visualizations, natural images, static webpages, and graphic designs) and compared the resulting clicks to eye fixations collected with eye-trackers in controlled lab settings. We found that BubbleView clicks successfully approximate eye fixations across these image types, and that the regions where people click can also be used to rank image and design elements by importance. BubbleView is designed to measure which information people consciously choose to examine, and it works best for well-defined tasks such as describing the content of an information visualization or measuring image importance. Compared to related methodologies based on a moving-window approach, BubbleView yields more reliable, less noisy data.
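
The core interaction can be sketched in a few lines: blur the stimulus once, then, for each participant click, composite a sharp circular patch from the original image over the blurred version. The snippet below is a minimal illustrative sketch assuming Pillow is available; the function name `bubble_reveal` and the blur and bubble radii are hypothetical choices for illustration, not values or code from the paper.

```python
# Minimal sketch of a BubbleView-style reveal step (hypothetical helper,
# not the authors' released code). Assumes Pillow; blur_radius and
# bubble_radius are illustrative parameters.
from PIL import Image, ImageDraw, ImageFilter


def bubble_reveal(original, blur_radius=20, bubble_radius=30, clicks=()):
    """Return the blurred image with sharp circular 'bubbles' at click points."""
    blurred = original.filter(ImageFilter.GaussianBlur(blur_radius))
    # Build a mask that is opaque inside each clicked bubble.
    mask = Image.new("L", original.size, 0)
    draw = ImageDraw.Draw(mask)
    for (x, y) in clicks:
        draw.ellipse(
            (x - bubble_radius, y - bubble_radius,
             x + bubble_radius, y + bubble_radius),
            fill=255,
        )
    # Composite: original resolution inside the bubbles, blurred elsewhere.
    return Image.composite(original, blurred, mask)


if __name__ == "__main__":
    img = Image.open("stimulus.png").convert("RGB")
    # Clicks collected from a participant (two example points here).
    view = bubble_reveal(img, clicks=[(120, 80), (300, 210)])
    view.save("bubbleview_frame.png")
```

The recorded click coordinates then play the role that fixation locations play in eye-tracking data, and can be aggregated into importance maps in the same way.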
