Toward Detection and Localization of Instruments in Minimally Invasive Surgery

Methods for detecting and localizing surgical instruments in laparoscopic images are an important component of advanced robotic and computer-assisted interventions. Robotic joint encoders and sensors integrated into or mounted on the instrument can provide information about the tool's position, but this information is often inaccurate once transferred into the surgeon's point of view. Vision sensors are currently a promising approach for determining the position of instruments in the coordinate frame of the surgical camera. In this study, we propose a vision algorithm for localizing the instrument's 3-D pose, leaving only the rotation about the axis of the tool's shaft as an ambiguity. We propose a probabilistic supervised classification method to detect pixels in laparoscopic images that belong to surgical tools. We then use the classifier output to initialize an energy-minimization algorithm that estimates the pose of a prior 3-D model of the instrument within a level-set framework. We show that the proposed method is robust to noise using simulated data, and we quantitatively validate the algorithm against ground truth obtained with an optical tracker. Finally, we demonstrate the practical application of the technique on in vivo data from minimally invasive surgery with both traditional laparoscopic and robotic instruments.
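As a rough illustration of the detection stage, the sketch below trains a random forest on simple per-pixel colour features and returns a tool-probability map. The choice of classifier, the opponent-colour features, and the scikit-learn API are assumptions made for illustration, not the paper's exact formulation; the resulting probability map would only serve to seed the pose estimation that follows.

```python
# Minimal sketch of a probabilistic pixel classifier for tool detection.
# Assumptions: a random forest on hand-crafted colour features stands in for
# the paper's probabilistic supervised classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def pixel_features(img_rgb):
    """Stack simple per-pixel colour features: RGB plus an opponent-colour
    transform, which is less sensitive to illumination than raw RGB."""
    rgb = img_rgb.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    o1 = (r - g) / np.sqrt(2.0)              # red-green opponent channel
    o2 = (r + g - 2.0 * b) / np.sqrt(6.0)    # yellow-blue opponent channel
    o3 = (r + g + b) / np.sqrt(3.0)          # intensity
    feats = np.stack([r, g, b, o1, o2, o3], axis=-1)
    return feats.reshape(-1, feats.shape[-1])


def train_tool_classifier(images, masks, n_trees=50):
    """images: list of HxWx3 uint8 frames; masks: list of HxW bool arrays
    marking tool pixels in hand-labelled training frames."""
    X = np.concatenate([pixel_features(im) for im in images])
    y = np.concatenate([m.reshape(-1).astype(np.uint8) for m in masks])
    clf = RandomForestClassifier(n_estimators=n_trees, n_jobs=-1)
    clf.fit(X, y)
    return clf


def tool_probability_map(clf, img_rgb):
    """Per-pixel posterior P(tool | colour), used to initialise the
    level-set pose estimation."""
    h, w = img_rgb.shape[:2]
    proba = clf.predict_proba(pixel_features(img_rgb))[:, 1]
    return proba.reshape(h, w)
```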

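The pose-estimation stage can be pictured with the following sketch of a region-based level-set energy in the spirit of pixel-wise posterior methods such as PWP3D: a cylindrical proxy for the instrument shaft is projected under a candidate pose, and a smoothed Heaviside of the signed distance to its silhouette blends the per-pixel foreground and background probabilities. The cylinder model, camera intrinsics K, parameter scales, and the crude finite-difference descent are all illustrative assumptions; the paper's own formulation is not reproduced here.

```python
# Sketch of a region-based pose energy and its minimisation over a 5-DoF pose
# (roll about the shaft axis is omitted, mirroring the ambiguity in the abstract).
import numpy as np
from scipy.ndimage import binary_closing, distance_transform_edt


def cylinder_silhouette(pose, K, shape, radius=0.005, length=0.25,
                        n_axial=200, n_angular=64):
    """Binary silhouette of a cylindrical shaft proxy under pose
    (tx, ty, tz, rx, ry). All geometry here is illustrative."""
    tx, ty, tz, rx, ry = pose
    # Sample the cylinder surface in its local frame (axis along +z).
    z = np.linspace(0.0, length, n_axial)
    a = np.linspace(0.0, 2.0 * np.pi, n_angular, endpoint=False)
    zz, aa = np.meshgrid(z, a, indexing="ij")
    pts = np.stack([radius * np.cos(aa), radius * np.sin(aa), zz],
                   axis=-1).reshape(-1, 3)
    # Rotate about the x and y axes, then translate into the camera frame.
    cx, sx, cy, sy = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    pts = pts @ (Ry @ Rx).T + np.array([tx, ty, tz])
    # Pinhole projection and rasterisation into the image plane.
    uvw = pts @ K.T
    w = np.maximum(uvw[:, 2], 1e-6)
    u = np.round(uvw[:, 0] / w).astype(int)
    v = np.round(uvw[:, 1] / w).astype(int)
    ok = (pts[:, 2] > 0) & (u >= 0) & (u < shape[1]) & (v >= 0) & (v < shape[0])
    mask = np.zeros(shape, dtype=bool)
    mask[v[ok], u[ok]] = True
    return binary_closing(mask, iterations=2)  # fill sampling gaps


def region_energy(silhouette, p_fg, p_bg, sigma=2.0):
    """A smoothed Heaviside of the signed distance to the projected contour
    blends per-pixel foreground/background probabilities; the energy is the
    negative log of that blend summed over the image."""
    phi = distance_transform_edt(silhouette) - distance_transform_edt(~silhouette)
    H = 1.0 / (1.0 + np.exp(-phi / sigma))  # ~1 inside the silhouette
    return -np.sum(np.log(p_fg * H + p_bg * (1.0 - H) + 1e-12))


def refine_pose(pose, K, shape, p_fg, p_bg, steps=50):
    """Crude finite-difference descent over the five pose parameters,
    purely for illustration; analytic gradients are the practical choice."""
    pose = np.asarray(pose, dtype=float)
    eps = np.array([1e-3, 1e-3, 1e-3, 1e-2, 1e-2])   # metres / radians
    step = np.array([2e-3, 2e-3, 2e-3, 2e-2, 2e-2])
    for _ in range(steps):
        e0 = region_energy(cylinder_silhouette(pose, K, shape), p_fg, p_bg)
        grad = np.zeros_like(pose)
        for i in range(pose.size):
            d = np.zeros_like(pose)
            d[i] = eps[i]
            e1 = region_energy(cylinder_silhouette(pose + d, K, shape), p_fg, p_bg)
            grad[i] = (e1 - e0) / eps[i]
        pose -= step * grad / (np.abs(grad).max() + 1e-12)
    return pose
```

In this toy setting, p_fg would be the probability map produced by the classifier sketched above and p_bg = 1 - p_fg; the absence of a rotation about the shaft axis among the parameters reflects the ambiguity stated in the abstract.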