Attention-Aware Robotic Laparoscope Based on Fuzzy Interpretation of Eye-Gaze Patterns

Laparoscopic robots have been widely adopted in modern medical practice. However, explicitly interacting with these robots can increase the surgeon's physical and cognitive load. An attention-aware robotic laparoscope system has been developed to free the surgeon from the technical limitations of visualization through the laparoscope. The system implicitly recognizes the surgeon's visual attention by interpreting natural eye movements with fuzzy logic and then automatically steers the laparoscope to focus on the viewing target. Experimental results show that the system can make surgeon–robot interaction more effective and intuitive, and it has the potential to make the execution of surgery smoother and faster. [DOI: 10.1115/1.4030608]
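The core idea is a fuzzy inference step that maps raw gaze measurements to an attention estimate before any camera motion is commanded. The sketch below illustrates only that step; the paper's actual membership functions and rule base are not given here, so the gaze features (fixation duration and gaze dispersion), the triangular memberships, the two rules, and the `attention_level` helper are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: inputs, membership shapes, and rules are assumptions.
# A simple Mamdani-style inference maps eye-gaze features to an attention score
# in [0, 1], using weighted-average defuzzification over singleton outputs.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def attention_level(fixation_ms, dispersion_deg):
    """Fuzzy estimate of visual attention from gaze features (hypothetical)."""
    # Fuzzify inputs (ranges are illustrative assumptions).
    fix_short  = tri(fixation_ms, -1, 0, 300)
    fix_long   = tri(fixation_ms, 200, 600, 10_000)
    disp_tight = tri(dispersion_deg, -1, 0, 2.0)
    disp_wide  = tri(dispersion_deg, 1.0, 3.0, 90.0)

    # Rule base (min for AND, max for OR):
    # R1: long fixation AND tight dispersion -> high attention
    # R2: short fixation OR wide dispersion  -> low attention
    high = min(fix_long, disp_tight)
    low  = max(fix_short, disp_wide)

    # Weighted average of singleton output values (low = 0.0, high = 1.0).
    if high + low == 0:
        return 0.0
    return (high * 1.0 + low * 0.0) / (high + low)

if __name__ == "__main__":
    # A steady 700 ms fixation confined to about 1 degree suggests focused attention.
    print(attention_level(fixation_ms=700, dispersion_deg=1.0))  # close to 1.0
    # Brief, scattered glances suggest no stable viewing target.
    print(attention_level(fixation_ms=150, dispersion_deg=5.0))  # close to 0.0
```

In the full system described in the abstract, such a defuzzified score would presumably gate whether the laparoscope holder re-centers the view on the current fixation point.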
