An eye tracking based virtual reality system for use inside magnetic resonance imaging systems

Patients undergoing Magnetic Resonance Imaging (MRI) often experience anxiety, and sometimes distress, prior to and during scanning. Here a fully MRI compatible virtual reality (VR) system is described and tested with the aim of creating a radically different patient experience. Potential benefits could accrue from the strong sense of immersion that VR can create, which could be used to design sensory experiences that avoid the perception of being enclosed and to provide new modes of diversion and interaction that make even lengthy MRI examinations much less challenging. Most current VR systems rely on head mounted displays combined with head motion tracking to achieve and maintain a visceral sense of a tangible virtual world, but this approach encourages physical motion, which would be unacceptable during MRI and is physically incompatible with the scanner environment. The proposed VR system instead uses gaze tracking to control and interact with the virtual world. MRI compatible cameras allow real time eye tracking, and robust gaze tracking is achieved through an adaptive calibration strategy in which each successive VR interaction initiated by the subject updates the gaze estimation model. A dedicated VR framework has been developed, including a rich virtual world and gaze-controlled game content. To help achieve an immersive experience, physical sensations, including noise, vibration and the proprioception associated with patient table movements, have been made congruent with the presented virtual scene. A live video link allows subject-carer interaction, projecting a supportive presence into the virtual world.
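
As a concrete illustration of the adaptive calibration strategy, the sketch below assumes a common formulation from video-based eye tracking: a second-order polynomial regression that maps pupil-centre coordinates to on-screen gaze coordinates and is refit by least squares each time a gaze-initiated VR interaction supplies a new pupil-target correspondence. The class and function names are hypothetical and the polynomial order is an assumption; this is a minimal example of the update-on-interaction idea, not the authors' implementation.

```python
# Minimal sketch of an adaptive gaze calibration loop (illustrative only).
# Assumes a second-order polynomial mapping from 2D pupil-centre coordinates
# to 2D screen coordinates, refit whenever a gaze-initiated VR interaction
# supplies a fresh (pupil, target) correspondence. Names are hypothetical.
import numpy as np

def poly_features(pupil_xy: np.ndarray) -> np.ndarray:
    """Second-order polynomial terms for one pupil-centre sample (x, y)."""
    x, y = pupil_xy
    return np.array([1.0, x, y, x * y, x * x, y * y])

class AdaptiveGazeCalibrator:
    def __init__(self):
        self.features = []   # accumulated polynomial feature vectors
        self.targets = []    # accumulated on-screen target positions (x, y)
        self.coeffs = None   # least-squares mapping, shape (6, 2)

    def add_interaction(self, pupil_xy, target_xy):
        """Each VR interaction the subject initiates adds a calibration
        pair and refits the gaze estimation model."""
        self.features.append(poly_features(np.asarray(pupil_xy, float)))
        self.targets.append(np.asarray(target_xy, float))
        A = np.vstack(self.features)
        B = np.vstack(self.targets)
        # Solve A @ coeffs ~= B in the least-squares sense.
        self.coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)

    def estimate_gaze(self, pupil_xy):
        """Map a pupil-centre measurement to estimated screen coordinates."""
        if self.coeffs is None:
            raise RuntimeError("No calibration data yet")
        return poly_features(np.asarray(pupil_xy, float)) @ self.coeffs
```

With only a handful of correspondences the fit is underdetermined, so in practice such a system would seed the model with an initial calibration sequence before switching to interaction-driven updates; the sketch only illustrates how each new interaction can refine the mapping over time.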
