Gaze gesture based human robot interaction for laparoscopic surgery

Highlights

- A gaze contingent robotic laparoscope is presented.
- Bimanual tasks can be performed without the need for a camera assistant.
- Learned gaze gestures are used to control zooming, panning, and tilting.
- An online gaze calibration method is used to maintain gaze tracking accuracy.
- Comprehensive user studies show significant improvements over using a camera assistant.

Abstract

While minimally invasive surgery offers great benefits in terms of reduced patient trauma and bleeding, as well as faster recovery times, it still presents surgeons with major ergonomic challenges. Laparoscopic surgery requires the surgeon to control surgical instruments bimanually throughout the operation, so a dedicated assistant is required to manoeuvre the camera, whose movements are often difficult to synchronise with the surgeon's. This article introduces a robotic system in which a rigid endoscope held by a robotic arm is controlled via the surgeon's eye movements, removing the need for a camera assistant. Gaze gestures, detected as a series of eye movements, convey the surgeon's intention to initiate gaze contingent camera control. Hidden Markov Models (HMMs) are used for real-time gaze gesture recognition, allowing the robotic camera to pan, tilt, and zoom while remaining immune to aberrant or unintentional eye movements. A novel online calibration method for the gaze tracker is proposed, which overcomes calibration drift and simplifies clinical application. The system has been validated through comprehensive user trials, with a detailed analysis of usability metrics to assess its performance. The results demonstrate that surgeons can perform their tasks more quickly and efficiently than with a camera assistant or foot switches.
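The abstract does not give implementation details for the HMM-based recognizer. Below is a minimal sketch of one way such a recognizer might work: saccades are quantized into direction symbols, each candidate gesture is scored by a discrete HMM via the scaled forward algorithm, and sequences that no model scores above a threshold are rejected so that stray eye movements do not trigger camera motion. The direction encoding, parameter shapes, and threshold are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

LOG_REJECT_THRESHOLD = -50.0  # illustrative rejection threshold, not from the paper


def quantize_saccade(dx, dy, n_dirs=8):
    """Map a saccade displacement (dx, dy) to one of n_dirs direction symbols."""
    angle = np.arctan2(dy, dx) % (2 * np.pi)
    return int(np.round(angle / (2 * np.pi / n_dirs))) % n_dirs


def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard scaled forward algorithm.

    obs : sequence of observation symbol indices
    pi  : (N,) initial state distribution
    A   : (N, N) transitions, A[i, j] = P(state j | state i)
    B   : (N, M) emissions, B[i, k] = P(symbol k | state i)
    """
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict states, then weight by emission
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha /= scale                  # rescale to avoid numerical underflow
    return log_lik


def classify_gesture(saccade_dirs, gesture_models, threshold=LOG_REJECT_THRESHOLD):
    """Score a saccade-direction sequence against every gesture HMM and return
    the best match, or None when nothing scores above the threshold (so
    unintentional eye movements are ignored rather than misclassified)."""
    scores = {name: forward_log_likelihood(saccade_dirs, *params)
              for name, params in gesture_models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```

In a setup like this, each gesture's (pi, A, B) parameters would typically be trained offline with Baum-Welch on recorded examples, and the rejection threshold tuned to trade false activations against missed gestures.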
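The abstract likewise only states that an online calibration method counteracts gaze-tracker drift, without describing it. As a stand-in for intuition, the sketch below maintains a running drift estimate from moments when the user's true fixation target is known (e.g. a just-selected on-screen landmark) and subtracts it from subsequent gaze samples; the class, exponential-moving-average update, and parameters are all assumptions for illustration, not the paper's method.

```python
import numpy as np


class OnlineDriftCorrector:
    """Illustrative running-offset corrector for gaze-tracking drift."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha              # EMA update rate (assumed value)
        self.offset = np.zeros(2)       # current drift estimate in pixels

    def observe_landmark(self, gaze_xy, landmark_xy):
        """Update the drift estimate when the true fixation target is known."""
        error = np.asarray(gaze_xy, dtype=float) - np.asarray(landmark_xy, dtype=float)
        self.offset = (1 - self.alpha) * self.offset + self.alpha * error

    def correct(self, gaze_xy):
        """Apply the current drift correction to a raw gaze sample."""
        return np.asarray(gaze_xy, dtype=float) - self.offset
```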
