Cognitive tracking of surgical instruments based on stereo vision and depth sensing

Accurate localization of surgical instruments is essential in navigated minimally invasive surgery, and instrument tracking modules are a core component of cognitive surgical robotic systems. Commercial optical trackers achieve sub-millimeter accuracy but typically operate in a single spectrum, either visible or infrared, which limits sensing and perception in the surgical environment. The objective of this research is to incorporate multiple broad-spectrum sensors, including stereo infrared (IR) cameras, color (RGB) cameras, and depth sensors, to perceive the surgical environment. Features extracted from each modality contribute to the cognition of complex surgical environments and procedures, and their combination provides higher robustness and accuracy than any single sensing modality. As a preliminary study, we propose a multi-sensor fusion approach for localizing surgical instruments, and we developed an integrated dual-Kinect tracking system to validate the proposed hierarchical tracking approach.
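The abstract does not specify the fusion rule used to combine the modalities; as a minimal illustrative sketch only (an assumption, not the authors' method), the snippet below fuses an IR-stereo position estimate with a depth-sensor estimate of an instrument tip using covariance intersection, which is a standard choice when the cross-correlation between sources is unknown. All variable names, noise values, and the coarse weight search are hypothetical.

```python
import numpy as np

def covariance_intersection(x_a, P_a, x_b, P_b, omega=None):
    """Fuse two 3-D position estimates with unknown cross-correlation.

    x_a, x_b : (3,) mean position estimates from two sensing modalities
    P_a, P_b : (3, 3) covariance matrices of those estimates
    omega    : mixing weight in [0, 1]; if None, chosen by a coarse search
               that minimizes the trace of the fused covariance
    """
    Pa_inv, Pb_inv = np.linalg.inv(P_a), np.linalg.inv(P_b)
    if omega is None:
        candidates = np.linspace(0.0, 1.0, 101)
        traces = [np.trace(np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv))
                  for w in candidates]
        omega = candidates[int(np.argmin(traces))]
    P = np.linalg.inv(omega * Pa_inv + (1 - omega) * Pb_inv)
    x = P @ (omega * Pa_inv @ x_a + (1 - omega) * Pb_inv @ x_b)
    return x, P

# Hypothetical example: an IR-marker estimate that is tight laterally but
# noisy in depth, fused with a depth-camera estimate that is tighter in depth.
x_ir    = np.array([10.2, 5.1, 300.0])   # mm
P_ir    = np.diag([0.1, 0.1, 4.0])       # mm^2
x_depth = np.array([10.6, 4.8, 298.5])
P_depth = np.diag([2.0, 2.0, 1.0])

x_fused, P_fused = covariance_intersection(x_ir, P_ir, x_depth, P_depth)
print(x_fused, np.trace(P_fused))
```

In this toy setting the fused covariance trace is smaller than that of either input, which is the sense in which combining modalities can improve on a single sensor; the actual hierarchical tracking pipeline in the paper may differ.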
