Real-time image-based instrument classification for laparoscopic surgery

During laparoscopic surgery, context-aware assistance systems aim to alleviate some of the difficulties the surgeon faces. To ensure that the right information is provided at the right time, the current phase of the intervention has to be known. Real-time localization and classification of the surgical tools currently in use are key components of both activity-based phase recognition and assistance generation. In this paper, we present an image-based approach that detects and classifies tools during laparoscopic interventions in real time. First, potential instrument bounding boxes are detected using a pixel-wise random forest segmentation. Each of these bounding boxes is then classified using a cascade of random forests. For this, multiple features, such as histograms over hue and saturation, gradients, and SURF features, are extracted from each detected bounding box. We evaluated our approach on five different videos from two different types of procedures, distinguishing between the four most common instrument classes (LigaSure, atraumatic grasper, aspirator, clip applier) and background. Our method successfully located up to 86% of all instruments. On manually provided bounding boxes, we achieve an instrument type recognition rate of up to 58%, and on automatically detected bounding boxes up to 49%. To our knowledge, this is the first approach that allows an image-based classification of surgical tools in a laparoscopic setting in real time.
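The hue and saturation histogram features mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, bin counts, and value ranges (OpenCV-style hue in [0, 180) and saturation in [0, 256)) are assumptions, and the paper additionally uses gradient and SURF features before feeding a cascade of random forests.

```python
import numpy as np

def patch_features(hsv_patch, bins=16):
    """Concatenate normalized hue and saturation histograms for one
    detected bounding-box patch.

    hsv_patch: (H, W, 3) array in an HSV colour space with hue in
    [0, 180) and saturation in [0, 256) (an assumption; the paper does
    not specify the exact ranges).
    """
    hue_hist, _ = np.histogram(hsv_patch[..., 0], bins=bins, range=(0, 180))
    sat_hist, _ = np.histogram(hsv_patch[..., 1], bins=bins, range=(0, 256))
    feats = np.concatenate([hue_hist, sat_hist]).astype(float)
    # Normalize so the feature vector is independent of patch size.
    return feats / max(feats.sum(), 1.0)
```

In a pipeline like the one described, such a fixed-length vector per bounding box would be concatenated with the other feature types and passed to the random forest cascade, whose first stage rejects background before the instrument type is decided.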
