Image tracking of laparoscopic instrument using spiking neural networks

Minimally Invasive Surgery (MIS) has become increasingly popular in recent years. An endoscopic image tracking system can help surgeons adjust the field of view autonomously during MIS. In this paper, we propose a novel image tracking algorithm based on natural features of surgical instruments. Because color is affected by lighting and white-balance conditions in endoscopic imagery, we use texture and geometric features of the laparoscopic instrument and adopt a spiking neural network approach for object detection. To enhance tracking performance, we further design a Kalman filter that is combined with the neuro-based tracker, so the instrument can be detected robustly despite deformation of its image during surgery. The developed methods were verified on a laparoscopic surgery video. Experimental results show that two instruments can be distinguished and tracked simultaneously in the surgical video.
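The tracker-plus-Kalman-filter combination described above can be illustrated with a minimal sketch: a constant-velocity Kalman filter that smooths the noisy per-frame tip positions a detector would emit. The class below is an illustrative pure-Python example, not the paper's implementation; the per-axis state decomposition and the noise parameters `q` and `r` are assumptions chosen for clarity.

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one image axis.

    State is [position, velocity]; the measurement is position only,
    e.g. the x (or y) pixel coordinate of a detected instrument tip.
    Run one filter per axis to smooth a 2-D track.
    """

    def __init__(self, pos, dt=1.0, q=1e-2, r=4.0):
        self.x = [pos, 0.0]                   # state estimate [pos, vel]
        self.P = [[10.0, 0.0], [0.0, 10.0]]   # state covariance
        self.dt, self.q, self.r = dt, q, r    # step, process/measurement noise

    def predict(self):
        """Propagate state one frame: F = [[1, dt], [0, 1]]."""
        dt = self.dt
        p, v = self.x
        self.x = [p + dt * v, v]
        P = self.P
        # P <- F P F^T + Q, with Q = diag(q, q) as a simple assumption
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        self.P = [[p00, p01], [p10, p11]]
        return self.x[0]                      # predicted position

    def update(self, z):
        """Fuse a detector measurement z (position), H = [1, 0]."""
        S = self.P[0][0] + self.r             # innovation covariance
        k0 = self.P[0][0] / S                 # Kalman gain (position)
        k1 = self.P[1][0] / S                 # Kalman gain (velocity)
        y = z - self.x[0]                     # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        P = self.P
        # P <- (I - K H) P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        return self.x[0]                      # filtered position
```

In a tracking loop, `predict()` gives a search region for the next frame (useful when the detector momentarily fails, e.g. under occlusion or deformation), and `update()` is called whenever the detector reports a position.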
