An intelligent and autonomous endoscopic guidance system for minimally invasive surgery

The endoscopic guidance system for minimally invasive surgery presented here autonomously aligns the laparoscopic camera with the end-effectors of the surgeon's instruments. It collects information on instrument movements from previous interventions and uses it to predict them for autonomous guidance of the endoscopic camera. Knowledge is extracted by trajectory clustering, maximum-likelihood classification and a Markov model that predicts states; alternative movements in an ongoing intervention are modeled as well. A first prototype of a robotic platform for minimally invasive surgery is described, comprising two instrument arms, an autonomous robotic camera assistant and two haptic devices for controlling the instrument arms. The approach of long-term prediction and optimal camera positioning was tested in a phantom experiment, achieving a hit rate of over 89% for predicting the movement of the end-effectors. Including this prediction in the computation of the camera position leads to 29.2% fewer camera movements and to improved visibility of the instruments.
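The abstract outlines a prediction pipeline in which recorded end-effector trajectories are clustered into discrete movement states and a Markov model estimates the most likely next state. The following is a minimal sketch of that general idea, not the authors' implementation: the state labels, the class name and the training data are hypothetical, and the sketch assumes that trajectory clustering and classification have already mapped raw motions to discrete labels.

```python
# Minimal sketch (assumed, not the authors' code) of Markov-model next-state
# prediction over discrete instrument-movement states. State labels such as
# "grasp" or "cut" are hypothetical placeholders for clustered trajectories.
from collections import defaultdict


class MarkovMovementPredictor:
    def __init__(self):
        # transition_counts[s][t] = how often state s was followed by state t
        self.transition_counts = defaultdict(lambda: defaultdict(int))

    def train(self, state_sequences):
        """Accumulate transition counts from recorded interventions,
        each given as an ordered list of movement-state labels."""
        for sequence in state_sequences:
            for current, nxt in zip(sequence, sequence[1:]):
                self.transition_counts[current][nxt] += 1

    def predict_next(self, current_state):
        """Return the most frequent successor of the current state
        (maximum-likelihood choice), or None if the state is unknown."""
        successors = self.transition_counts.get(current_state)
        if not successors:
            return None
        return max(successors, key=successors.get)


# Hypothetical usage: three recorded interventions reduced to state labels.
if __name__ == "__main__":
    recorded = [
        ["grasp", "pull", "cut", "retract"],
        ["grasp", "pull", "cut", "coagulate"],
        ["grasp", "cut", "retract"],
    ]
    predictor = MarkovMovementPredictor()
    predictor.train(recorded)
    print(predictor.predict_next("pull"))  # -> "cut"
```

In such a scheme, the predicted next state could be handed to the camera-positioning component ahead of time, which is consistent with the reported reduction in camera movements, although the paper's actual state definitions and prediction horizon are not specified in the abstract.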
