An Experimental Comparison Towards Autonomous Camera Navigation to Optimize Training in Robot Assisted Surgery

Robot-Assisted Surgery enhances vision and can restore depth perception, but it requires learning to teleoperate both the surgical tools and the endoscope. Combined with the complexity of selecting the optimal viewpoint for carrying out the procedure, this calls for dedicated training. This work proposes autonomous camera navigation during the initial stages of training in order to optimize the learning of these skills. A user study involving 26 novice participants was carried out using the master console of the da Vinci Research Kit and a virtual reality training environment. The subjects were randomly divided into two groups: a control group that manually controlled the camera, as in current practice, and an experimental group that trained with autonomous camera navigation. After training, the time-accuracy metrics of the users who experienced autonomous camera navigation were significantly higher than those of the control group. Additionally, autonomous camera navigation appeared to provide an imprinting of good endoscope management.
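For context, the sketch below illustrates one common autonomous camera policy reported in the literature: translating the endoscope so that the midpoint of the two instrument tips stays on the optical axis at a target viewing distance. This is only a minimal illustrative assumption, not the controller used in this study; the function name, frame conventions, and gains are hypothetical.

```python
# Minimal sketch (assumption, not the paper's algorithm): keep the midpoint
# of the two tool tips centred in the endoscope view at a target depth.
import numpy as np

def camera_velocity(tip_left, tip_right, target_depth=0.10, gain=1.0):
    """Proportional translational velocity for the camera, in the camera frame.

    tip_left, tip_right: 3D tool-tip positions in the camera frame (metres),
    with +z along the optical axis.
    """
    midpoint = 0.5 * (np.asarray(tip_left, float) + np.asarray(tip_right, float))
    desired = np.array([0.0, 0.0, target_depth])  # on-axis, at target depth
    # Translating the camera by +delta shifts scene points by -delta in the
    # camera frame, so drive the camera toward (midpoint - desired).
    return gain * (midpoint - desired)

# Usage with hypothetical tip positions (camera frame, metres):
v_cam = camera_velocity([0.02, -0.01, 0.12], [0.04, 0.01, 0.14])
print(v_cam)  # velocity command that re-centres the instruments in the view
```

In practice such a command would be mapped to endoscope joint motion through the robot's kinematics and smoothed or deadbanded to avoid distracting camera motion; those details are omitted here.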
