Sensors management in robotic neurosurgery: The ROBOCAST project

Robot and computer-aided surgery platforms bring a variety of sensors into the operating room. These sensors generate information that must be synchronized and merged to improve the accuracy and safety of the surgical procedure for both patients and operators. In this paper, we present the development of a sensor management architecture used to gather and fuse data from localization systems, such as optical and electromagnetic trackers, and from ultrasound imaging devices. The architecture follows a modular client-server approach and was implemented within the EU-funded project ROBOCAST (FP7 ICT 215190). Furthermore, it is based on well-maintained open-source libraries such as OpenCV and the Image-Guided Surgery Toolkit (IGSTK), which are supported by a worldwide community of developers and allow a significant reduction of software costs. We conducted experiments to evaluate the performance of the sensor manager module. We computed the response time needed for a client to receive tracking data or video images, and the time lag between synchronous acquisitions with an optical tracker and an ultrasound machine. Results showed a median delay of 1.9 ms for a client request for tracking data and of about 40 ms for US images; these values are compatible with the data generation rates (20–30 Hz for the tracking systems and 25 fps for PAL video). Simultaneous acquisitions were performed with an optical tracking system and a US imaging device: the data were aligned according to the timestamp associated with each sample, and the delay was estimated with a cross-correlation study. A median delay of 230 ms was calculated, showing that real-time 3D reconstruction is not feasible (an offline temporal calibration is needed), although a slow exploration is possible. In conclusion, as far as asleep-patient neurosurgery is concerned, the proposed setup is indeed useful for registration error correction, because brain shift occurs with a time constant of a few tens of minutes.
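
The response time for a client request can be characterized with a simple round-trip timing loop against the sensor manager. The sketch below is only illustrative: it assumes a hypothetical line-oriented TCP interface, and the "GET_TRACKING" message, port number, and reply format are placeholders rather than the actual ROBOCAST protocol.

```python
import socket
import statistics
import time

def measure_tracking_latency(host="127.0.0.1", port=5050, n_requests=1000):
    """Measure the round-trip time of a tracking-data request sent to the
    sensor manager over TCP. The request message, port, and reply format
    are placeholders; the real ROBOCAST interface is not described here.
    """
    delays_ms = []
    with socket.create_connection((host, port)) as sock:
        reply = sock.makefile("rb")
        for _ in range(n_requests):
            t0 = time.perf_counter()
            sock.sendall(b"GET_TRACKING\n")
            reply.readline()  # e.g. "timestamp x y z qx qy qz qw\n"
            delays_ms.append((time.perf_counter() - t0) * 1000.0)
    return statistics.median(delays_ms)
```

Reporting the median, as above, matches the summary statistic used for the 1.9 ms tracking-data and 40 ms US-image results.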
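
The abstract does not detail how the cross-correlation study was carried out. A minimal sketch is given below, assuming that both acquisitions reduce to a scalar motion signal (e.g. one coordinate of the probe seen by the tracker and the corresponding feature position extracted from the US images) sampled against a common clock; the function and variable names are illustrative only.

```python
import numpy as np

def estimate_stream_delay(t_ref, x_ref, t_us, x_us, dt=0.005):
    """Estimate the temporal offset between two timestamped scalar signals:
    a reference signal from the optical tracker and the corresponding
    signal extracted from the US images.

    Returns the delay in seconds of the US stream relative to the tracker
    stream (positive means the US data lag the tracker data).
    """
    # Resample both signals onto a common, uniformly spaced time base
    # covering only the interval where the acquisitions overlap.
    t0 = max(t_ref[0], t_us[0])
    t1 = min(t_ref[-1], t_us[-1])
    t = np.arange(t0, t1, dt)
    a = np.interp(t, t_ref, x_ref)   # tracker signal
    b = np.interp(t, t_us, x_us)     # US signal

    # Remove the mean so the correlation peak reflects shape, not offset.
    a = a - a.mean()
    b = b - b.mean()

    # Full cross-correlation; the peak index gives the lag in samples.
    xcorr = np.correlate(b, a, mode="full")
    lag_samples = int(np.argmax(xcorr)) - (len(a) - 1)
    return lag_samples * dt
```

A lag estimated this way can then be applied as an offline temporal calibration, shifting the US timestamps before spatial alignment with the tracker data.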
