GazeTap: towards hands-free interaction in the operating room

During minimally invasive interventions, physicians need to interact with medical image data, which they cannot do while their hands are occupied. To address this challenge, we propose two interaction techniques that use gaze and foot input as modalities for hands-free interaction. To investigate their feasibility, we created a setup consisting of a mobile eye-tracking device, a tactile floor, two laptops, and the large screen of an angiography suite. We conducted a user study to evaluate how medical images can be navigated without hand interaction. Both multimodal approaches, as well as a foot-only interaction technique, were compared regarding task completion time and subjective workload. The results revealed comparable performance across all methods. Selection is accomplished faster via gaze than with the foot-only approach, but gaze and foot input easily interfere when used at the same time. This paper contributes to HCI by providing techniques and evaluation results for combined gaze and foot interaction while standing. Our methods may enable more effective computer interaction in the operating room, resulting in a more beneficial use of medical information.
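The abstract describes gaze used for pointing and discrete foot taps on a tactile floor used for confirmation. The following is a minimal sketch of how such input fusion might be structured; it is not the authors' implementation, and all names (GazeSource, FootTapSource, select_at) are hypothetical stand-ins for the eye-tracker, floor-sensor, and image-viewer APIs.

```python
# Sketch only: gaze sets the pointer, a foot tap confirms the selection.
# GazeSource, FootTapSource, and select_at are hypothetical placeholders,
# not APIs from the paper or any specific library.

import time


class GazeSource:
    """Placeholder for a mobile eye tracker streaming screen coordinates."""

    def latest_point(self):
        # Would return the most recent gaze position mapped onto the
        # large display as (x, y) in pixels; stubbed here.
        return (960, 540)


class FootTapSource:
    """Placeholder for a tactile floor reporting discrete foot taps."""

    def poll_tap(self):
        # Would return True once per detected tap; stubbed here.
        return False


def select_at(point):
    """Placeholder for triggering a selection in the image viewer."""
    print(f"select image element at {point}")


def interaction_loop(gaze, feet, tap_cooldown_s=0.15):
    """Fuse the two modalities: gaze provides the target, a foot tap
    confirms it. A short cooldown after each tap avoids double triggers."""
    last_tap = 0.0
    while True:
        if feet.poll_tap() and time.time() - last_tap > tap_cooldown_s:
            select_at(gaze.latest_point())
            last_tap = time.time()
        time.sleep(0.01)  # poll at roughly 100 Hz
```

Separating pointing (continuous gaze) from confirmation (discrete foot tap) is one way to reduce the interference between the two modalities that the study observed when they are used simultaneously.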
