Towards non-invasive patient tracking: optical image analysis for spine tracking during spinal surgery procedures

Surgical navigation systems can enhance the surgeon's vision and form a reliable image-guided tool for complex interventions such as spinal surgery. The main prerequisite is successful patient tracking, which implies optimal motion compensation. Nowadays, optical tracking systems can detect the patient position during surgery, allowing navigation without the risk of damaging neurovascular structures. However, the spine is subject to movements of the individual vertebrae, which can impact the accuracy of the system. The aim of this paper is to investigate the feasibility of a novel approach that establishes a direct relationship with the movements of the spinal vertebrae during surgery. To this end, we detect and track patient spine features between different image views, captured by several optical cameras, to reconstruct vertebral rotation and displacement. We analyze patient images acquired in a real surgical scenario by two gray-scale cameras embedded in the flat-panel detector of the C-arm. The spine is segmented and anatomical landmarks are defined and tracked between the views, while experimenting with several feature detection algorithms (e.g., SURF and MSER). The 3D positions of the matched features are reconstructed and the triangulation errors are computed for an accuracy assessment. The analysis of the triangulation accuracy reveals a mean error of 0.38 mm, which demonstrates the feasibility of spine tracking and supports the clinical application of optical imaging for spinal navigation.
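To make the described pipeline concrete, the sketch below illustrates the generic sequence of steps the abstract outlines: detect features in two camera views, match them across views, triangulate the matches to 3D, and report an accuracy figure. This is not the authors' implementation. It assumes OpenCV and NumPy are available, that the projection matrices P1 and P2 of the two C-arm cameras are known from a prior stereo calibration, and it uses ORB as a freely available stand-in for SURF/MSER (which require the non-free OpenCV contrib build). The returned reprojection error is a simple accuracy proxy, not the millimetre triangulation error reported in the paper.

```python
# Minimal sketch of a two-view feature-triangulation pipeline (hypothetical,
# not the authors' code). Assumes calibrated projection matrices P1, P2 (3x4).
import cv2
import numpy as np


def triangulate_spine_features(img1, img2, P1, P2, max_matches=200):
    """Detect, match and triangulate features between two gray-scale views."""
    # ORB used here as a stand-in for SURF/MSER from the paper.
    detector = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = detector.detectAndCompute(img1, None)
    kp2, des2 = detector.detectAndCompute(img2, None)

    # Brute-force Hamming matching with cross-check for binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    matches = matches[:max_matches]

    # 2xN arrays of corresponding image points, as expected by triangulatePoints.
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches]).T
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches]).T

    # Linear triangulation; convert homogeneous output to Euclidean 3D points.
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
    pts3d = (pts4d[:3] / pts4d[3]).T  # Nx3

    # Mean reprojection error over both views as a simple accuracy proxy.
    def reproj_error(P, pts2d):
        proj = P @ np.vstack([pts3d.T, np.ones(len(pts3d))])
        proj = proj[:2] / proj[2]
        return np.linalg.norm(proj - pts2d, axis=0)

    err = 0.5 * (reproj_error(P1, pts1) + reproj_error(P2, pts2))
    return pts3d, float(err.mean())
```

In practice, such a sketch would be preceded by spine segmentation so that only features on the exposed anatomy are matched, and the resulting 3D landmark positions would then be related to vertebral rotation and displacement, as described in the abstract.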
