This paper describes a method for tracking the camera motion of a real endoscope by using epipolar geometry analysis and CT-derived virtual endoscopic images. A navigation system for a flexible endoscope guides physicians by providing navigation information during endoscopic examinations. We estimate the camera motion from an endoscopic video sequence based on epipolar geometry analysis and image registration between virtual endoscopic (VE) and real endoscopic (RE) images. The method consists of three parts: (a) direct estimation of camera motion by epipolar geometry analysis, (b) precise estimation by image registration, and (c) detection of frames containing bubbles to avoid misregistration. First, we compute optical flow patterns from two consecutive frames. The camera motion is obtained by substituting the computed flow vectors into the epipolar equations. We then search for the observation parameters of a virtual endoscopy system that generate the VE view most similar to the current RE frame. These processes are executed for all frames of the RE video except those in which bubbles appear. We applied the proposed method to RE videos of three patients for whom CT images were available. The experimental results show that the method can track camera motion continuously for more than 500 frames in the best case.
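As a rough illustration of step (a), the sketch below estimates the relative camera motion between two consecutive RE frames from optical flow and the epipolar constraint. It uses OpenCV; the feature tracker, the RANSAC essential-matrix solver, and the intrinsic camera matrix `K` are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: camera motion from optical flow + epipolar geometry (assumed OpenCV pipeline).
import cv2
import numpy as np

def estimate_camera_motion(prev_frame, curr_frame, K):
    """Estimate relative rotation R and translation direction t between two RE frames."""
    # Track sparse feature points from the previous frame (optical flow).
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_frame, curr_frame,
                                                   prev_pts, None)
    good_prev = prev_pts[status.ravel() == 1]
    good_curr = curr_pts[status.ravel() == 1]

    # Solve the epipolar constraint x'^T E x = 0 for the essential matrix;
    # RANSAC rejects flow vectors that violate the constraint.
    E, mask = cv2.findEssentialMat(good_prev, good_curr, K,
                                   method=cv2.RANSAC, threshold=1.0)

    # Decompose E into rotation R and unit translation t. The translation
    # scale is not recoverable from two views alone, which is one reason a
    # subsequent VE/RE image-registration step refines the estimate.
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_curr, K, mask=mask)
    return R, t
```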