A Self-Adaptive Motion Scaling Framework for Surgical Robot Remote Control

Master–slave control is a common form of human–robot interaction for robotic surgery. To ensure seamless and intuitive control, a self-adaptive motion scaling mechanism for teleoperation is proposed in this letter. The operator retains precise control when conducting delicate or complex manipulation, while movement towards a remote target is accelerated via adaptive motion scaling. The proposed framework consists of three components: 1) situation awareness, 2) skill level awareness, and 3) task awareness. The self-adaptive motion scaling ratio allows operators to perform surgical tasks with high efficiency, forgoing the need for frequent clutching and instrument repositioning. The proposed framework has been verified on a da Vinci Research Kit to assess its usability and robustness. An in-house database is constructed for offline model training and parameter estimation, including both kinematic data obtained from the robot and visual cues captured through the endoscope. Detailed user studies indicate that a suitable motion scaling ratio can be obtained and adjusted online. The overall performance of the operators in terms of control efficiency and task completion is significantly improved with the proposed framework.
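
To illustrate the idea, the following is a minimal sketch of how an adaptive scaling ratio might blend the three awareness terms and rescale master increments before they are sent to the slave. All function names, weights, and the blending rule are illustrative assumptions for this sketch, not the authors' actual formulation.

```python
import numpy as np

def scaling_ratio(situation, skill, task, s_min=0.2, s_max=1.0):
    """Blend three awareness terms (each normalised to [0, 1]) into a single
    motion-scaling ratio clipped to [s_min, s_max].

    situation: proximity/criticality cue, e.g. closeness to tissue or target
    skill:     estimated operator skill level
    task:      recognised task phase (0 = fine manipulation, 1 = gross transfer)

    The weighted blend below is a hypothetical choice: delicate situations and
    fine-manipulation phases pull the ratio down (slave moves less per master
    motion), while gross transfer towards a remote target pushes it up.
    """
    blend = 0.5 * situation + 0.2 * skill + 0.3 * task
    return float(np.clip(s_min + (s_max - s_min) * blend, s_min, s_max))

def slave_increment(master_delta, ratio):
    """Scale a Cartesian master increment before commanding the slave."""
    return ratio * np.asarray(master_delta, dtype=float)

if __name__ == "__main__":
    # Gross approach to a remote target: ratio close to 1.0 (fast slave motion)
    print(scaling_ratio(situation=0.9, skill=0.8, task=1.0))
    # Delicate manipulation near tissue: ratio pulled towards the lower bound
    print(scaling_ratio(situation=0.1, skill=0.2, task=0.0))
    # Rescale a 3-D master increment with a ratio of 0.3
    print(slave_increment([1.0, 0.5, 0.0], 0.3))
```

In a real teleoperation loop the ratio would be re-estimated online from the endoscopic video and robot kinematics and applied to each incremental master command, so the operator never needs to clutch or reposition the instrument manually.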
