The art of pervasive eye tracking: unconstrained eye tracking in the Austrian Gallery Belvedere

Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior, offering a high degree of ecological validity in natural environments. However, challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and usage, to some extent, an art. Nonetheless, researchers are pushing the boundaries of this technology to help assess museum visitors' attention not only between the exhibited works but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we present in detail the eye tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report on usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, real-time accuracy already suffices for simple applications such as audio guides for the majority of users, even in the absence of eye-tracker slippage compensation.
