A privacy-preserving approach to streaming eye-tracking data

Eye-tracking technology is increasingly integrated into mixed reality devices. Although it enables critical applications, it also creates significant opportunities to violate user privacy expectations. We show that there is an appreciable risk of unique user identification even under natural viewing conditions in virtual reality. Such identification would allow an app to connect a user's personal ID with their work ID without their consent, for example. To mitigate these risks, we propose a framework that incorporates gatekeeping via the design of the application programming interface (API) and via software-implemented privacy mechanisms. Our results indicate that these mechanisms can reduce the rate of identification from as high as 85% to as low as 30%. Introducing these mechanisms adds less than 1.5° of error in gaze position for gaze prediction. Gaze data streams can thus be made private while still supporting gaze prediction, for example, for foveated rendering. Our approach is the first to support privacy by design in the flow of eye-tracking data within mixed reality use cases.
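
To make the gatekeeping idea concrete, here is a minimal sketch of what such an API-level gatekeeper could look like, assuming additive Gaussian spatial noise and temporal downsampling as the software-implemented privacy mechanisms; the names used here (GazeSample, GazeGatekeeper, sigma_deg, keep_every) are illustrative assumptions, not the paper's actual API.

```python
# Minimal sketch of an API-level gatekeeper for streaming gaze data.
# Assumption: additive Gaussian spatial noise plus temporal downsampling
# stand in for the paper's privacy mechanisms; all names are hypothetical.
import random
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze angle in degrees
    y: float  # vertical gaze angle in degrees


class GazeGatekeeper:
    """Sits between the eye tracker and untrusted apps: apps never see
    raw samples, only the privatized stream this class emits."""

    def __init__(self, sigma_deg: float = 1.0, keep_every: int = 2):
        self.sigma_deg = sigma_deg    # std. dev. of Gaussian spatial noise
        self.keep_every = keep_every  # temporal downsampling factor

    def privatize(self, raw: Iterable[GazeSample]) -> Iterator[GazeSample]:
        for i, s in enumerate(raw):
            if i % self.keep_every:  # drop samples to reduce temporal detail
                continue
            yield GazeSample(
                t=s.t,
                x=s.x + random.gauss(0.0, self.sigma_deg),
                y=s.y + random.gauss(0.0, self.sigma_deg),
            )


# Usage: an app requests gaze through the gatekeeper, never the device.
if __name__ == "__main__":
    raw = (GazeSample(t=i / 120.0, x=0.0, y=0.0) for i in range(10))
    gate = GazeGatekeeper(sigma_deg=0.5, keep_every=2)
    for sample in gate.privatize(raw):
        print(f"{sample.t:.3f}s -> ({sample.x:+.2f}°, {sample.y:+.2f}°)")
```

The key design point this sketch illustrates is that raw gaze never crosses the API boundary: untrusted applications only ever receive samples through privatize(), so the noise level and sampling rate remain under the platform's control rather than the app's.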
