Towards Detection and Localisation of Instruments in Minimally Invasive Surgery

Methods for detecting and localising surgical instruments in laparoscopic images are an important component of advanced robotic and computer-assisted interventions. Robotic joint encoders and sensors integrated into or mounted on the instrument can provide information about the tool's position, but this information is often inaccurate when transferred to the surgeon's point of view. Vision sensors are currently a promising approach for determining the position of instruments in the coordinate frame of the surgical camera. In this study, we propose a vision algorithm for localising the instrument's 3D pose, leaving only the rotation about the axis of the tool's shaft as an ambiguity. We propose a probabilistic supervised classification method to detect pixels in laparoscopic images that belong to surgical tools. The classifier output is then used to initialise an energy-minimisation algorithm that estimates the pose of a prior 3D model of the instrument within a level-set framework. We show that the proposed method is robust to noise using simulated data, and we quantitatively validate the algorithm against ground truth obtained with an optical tracker. Finally, we demonstrate the practical application of the technique on in vivo MIS data with traditional laparoscopic and robotic instruments.
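
The abstract outlines a two-stage pipeline: a probabilistic supervised classifier produces a per-pixel tool probability map, which then drives a region-based energy minimisation over the pose of a 3D instrument model in a level-set framework. The sketch below illustrates both ideas under stated assumptions; the HSV colour features, the scikit-learn random forest, and the helper names (pixel_features, region_energy, silhouette_mask) are illustrative choices, not the authors' implementation.

```python
# A minimal sketch (not the paper's implementation) of the two stages the
# abstract describes: (1) a probabilistic per-pixel classifier that labels
# laparoscopic pixels as tool vs. tissue, and (2) a region-based energy,
# in the spirit of pixel-wise posterior / level-set pose estimation, that
# scores a candidate instrument silhouette against the probability map.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from skimage.color import rgb2hsv


def pixel_features(image_rgb):
    """Per-pixel colour features: HSV values, flattened to (n_pixels, 3)."""
    hsv = rgb2hsv(image_rgb)
    return hsv.reshape(-1, 3)


def train_tool_classifier(images, masks, n_trees=50):
    """Fit a random forest on labelled frames (mask == 1 marks tool pixels)."""
    X = np.vstack([pixel_features(im) for im in images])
    y = np.concatenate([m.reshape(-1) for m in masks])
    clf = RandomForestClassifier(n_estimators=n_trees, n_jobs=-1)
    clf.fit(X, y)
    return clf


def tool_probability_map(clf, image_rgb):
    """Posterior probability that each pixel belongs to the instrument."""
    p = clf.predict_proba(pixel_features(image_rgb))[:, 1]
    return p.reshape(image_rgb.shape[:2])


def region_energy(prob_map, silhouette_mask, eps=1e-6):
    """
    Region-based cost of a hypothesised instrument silhouette: negative
    log-likelihood of tool pixels inside the silhouette and of background
    pixels outside it.  Minimising this over the pose parameters that
    generate `silhouette_mask` is the kind of energy minimisation the
    abstract refers to.
    """
    fg = -np.log(prob_map[silhouette_mask] + eps).sum()
    bg = -np.log(1.0 - prob_map[~silhouette_mask] + eps).sum()
    return fg + bg
```

In practice, `silhouette_mask` would be obtained by projecting the prior 3D instrument model into the image under a candidate pose, and the energy would be minimised over the pose parameters; because the projected silhouette of a roughly cylindrical shaft is unchanged by rotation about its own axis, that rotation remains ambiguous, as the abstract notes.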
