Probabilistic neural network applied to eye tracking control to alter the direction of an endoscopic manipulator

In this study, we propose a novel endoscopic manipulation system controlled by a surgeon's eye movements. The direction of the endoscopic manipulator's optical axis is altered intuitively based on the surgeon's pupil movements. We developed a graphical user interface that divides the monitor screen into several regions with defined boundaries, so that the intended movement direction of the endoscope can be identified from the region the operator gazes at. A probabilistic neural network (PNN) determines the regional distribution proportion (RDP), which recognizes the direction in which the operator wants the endoscopic manipulator to move. The PNN model was trained on each operator's individual calibration data. We hypothesized that PNN training could be completed immediately after calibration, which also determines the RDP boundaries. To verify the PNN's effectiveness in the proposed system, we designed an experiment and recorded the path of each direction change. All participants, including four who wore glasses, completed the requested task, and wearing glasses had no significant effect on the performance of the proposed system. Furthermore, PNN training took only 2% of the total procedure time to handle individual differences. We conclude that our method can accommodate individual differences in operators' eyes through machine learning on personal calibration data within a short time frame, without adding significant extra preoperative setup time.
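To illustrate why PNN training can finish almost immediately after calibration, the sketch below shows a minimal PNN in the sense of Specht (1990), i.e. a Parzen-window classifier with Gaussian kernels, that maps a gaze point to a screen region. The region labels, the calibration points, and the smoothing parameter sigma are illustrative assumptions, not values from the paper.

```python
import numpy as np

class PNN:
    """Minimal sketch of a probabilistic neural network (Parzen-window
    classifier with Gaussian kernels). Assumed, simplified interface."""

    def __init__(self, sigma=0.1):
        self.sigma = sigma      # kernel width (smoothing parameter)
        self.classes = None
        self.patterns = None    # calibration samples grouped by class

    def fit(self, X, y):
        # PNN "training" only memorizes the calibration samples, which is
        # why it completes essentially instantly after calibration.
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.classes = np.unique(y)
        self.patterns = {c: X[y == c] for c in self.classes}
        return self

    def predict(self, x):
        # Classify one gaze point by the class whose Gaussian-kernel
        # density estimate at x is largest.
        x = np.asarray(x, dtype=float)
        scores = []
        for c in self.classes:
            d2 = np.sum((self.patterns[c] - x) ** 2, axis=1)
            scores.append(np.mean(np.exp(-d2 / (2 * self.sigma ** 2))))
        return self.classes[int(np.argmax(scores))]

# Hypothetical usage: calibration gaze points in normalized screen
# coordinates, each labeled with the region the operator fixated.
calib_X = [(0.5, 0.1), (0.5, 0.9), (0.1, 0.5), (0.9, 0.5), (0.5, 0.5)]
calib_y = ["up", "down", "left", "right", "center"]
pnn = PNN(sigma=0.1).fit(calib_X, calib_y)
print(pnn.predict((0.48, 0.15)))  # -> "up"
```

In this formulation the decision boundaries between screen regions fall out of the per-class density estimates, which is consistent with the paper's claim that calibration simultaneously fixes the RDP boundaries.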
