A dataset of piercing a needle through deformable objects for Deep Learning from Demonstrations

Many robotic tasks are still teleoperated because automating them is time-consuming and expensive. Robot Learning from Demonstrations (RLfD) can reduce programming time and cost. However, conventional RLfD approaches are not directly applicable to many robotic tasks, e.g. robotic suturing with minimally invasive robots, as they require a time-consuming process of designing features from visual information. Deep Neural Networks (DNNs) have emerged as useful tools for building complex models that capture the relationship between a high-dimensional observation space and a low-level action/state space. Nonetheless, such approaches require a dataset suitable for training appropriate DNN models. This paper presents a dataset of inserting/piercing a needle with the two arms of the da Vinci Research Kit in/through soft tissues. The dataset consists of (1) 60 successful needle insertion trials with randomised desired exit points, recorded by six high-resolution calibrated cameras, (2) the corresponding robot data and calibration parameters, and (3) the commanded robot control inputs, with all collected data synchronised. The dataset is designed for Deep-RLfD approaches. We also implemented several deep RLfD architectures, including simple feed-forward CNNs and different Recurrent Convolutional Networks (RCNs). Our study indicates that RCNs improve the prediction accuracy of the model, although the baseline feed-forward CNN already successfully learns the relationship between the visual information and the robot's next-step control actions. The dataset, as well as our baseline RLfD implementations, is publicly available for benchmarking at https://github.com/imanlab/d-lfd.
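To illustrate the kind of recurrent convolutional architecture the abstract refers to, the sketch below (in PyTorch) encodes each camera frame with a small CNN, aggregates the per-frame features with an LSTM, and regresses the next-step robot control action from the final hidden state. The layer sizes, 64×64 grayscale input, and 7-DoF action dimension are illustrative assumptions, not the authors' exact architecture.

```python
# Hypothetical RCN baseline for deep RLfD: CNN frame encoder + LSTM over time
# + linear head predicting the next-step control action. All dimensions are
# assumptions for illustration.
import torch
import torch.nn as nn

class RCNPolicy(nn.Module):
    def __init__(self, action_dim=7, hidden=128):
        super().__init__()
        # Per-frame convolutional encoder (assumes 1x64x64 grayscale frames).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),   # -> 32 * 4 * 4 = 512 features
        )
        # Recurrent core over the sequence of frame features.
        self.lstm = nn.LSTM(512, hidden, batch_first=True)
        # Regress the next-step control action from the last hidden state.
        self.head = nn.Linear(hidden, action_dim)

    def forward(self, frames):                       # frames: (B, T, 1, 64, 64)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1))   # (B*T, 512)
        out, _ = self.lstm(feats.view(b, t, -1))     # (B, T, hidden)
        return self.head(out[:, -1])                 # (B, action_dim)

model = RCNPolicy()
actions = model(torch.zeros(2, 5, 1, 64, 64))        # two dummy 5-frame clips
print(tuple(actions.shape))                          # (2, 7)
```

A feed-forward CNN baseline would simply drop the LSTM and predict from a single frame; the recurrence is what lets the model exploit tissue-deformation history across frames.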
