Maximization of signal-to-noise ratio in optical coherence tomography using a depth-dependent matched filter

We discuss and demonstrate the dependence of noise on the signal in time-domain optical coherence tomography (TDOCT). We then derive a depth-dependent matched filter that maximizes the signal-to-noise ratio (SNR) at every pixel in a depth scan (A-scan). Using an empirical estimate of the second-order statistics of the noise in OCT images of vascular tissue, we implement a depth-dependent filter matched to these images. Applying our filter yields an average SNR increase of about 7 dB compared with a simple averaging operation. Although derived for time-domain OCT, our filter is also applicable to other types of OCT systems.
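As a concrete illustration, the sketch below implements the textbook matched-filter result in colored noise that underlies such a design: for a local signal template s_z and a per-depth noise covariance R_z, the SNR-maximizing weights are w_z proportional to R_z^{-1} s_z. This is a minimal sketch under stated assumptions, not the paper's actual implementation; the function names, window length W, Hanning template, and synthetic data are all illustrative.

```python
# Sketch of a depth-dependent matched filter for an OCT A-scan (assumed
# setup, not the paper's implementation). At each depth z the classical
# colored-noise result gives SNR-maximizing weights w_z = R_z^{-1} s_z,
# where R_z is the local noise covariance and s_z a local signal template.
import numpy as np

def estimate_covariances(noise_scans, W):
    """Estimate a W x W noise covariance at each depth from an ensemble
    of noise-only A-scans (shape: n_scans x n_depths)."""
    n_scans, n_depths = noise_scans.shape
    half = W // 2
    padded = np.pad(noise_scans, ((0, 0), (half, half)), mode="reflect")
    covs = np.empty((n_depths, W, W))
    for z in range(n_depths):
        windows = padded[:, z:z + W]           # ensemble of local windows
        windows = windows - windows.mean(0)    # remove ensemble mean
        covs[z] = windows.T @ windows / (n_scans - 1)
    return covs

def apply_depth_dependent_filter(a_scan, templates, covs):
    """At each depth z, apply w_z = R_z^{-1} s_z, normalized for unit
    gain on the template, to a local window of the A-scan."""
    n_depths = a_scan.size
    W = covs.shape[1]
    half = W // 2
    padded = np.pad(a_scan, half, mode="reflect")
    out = np.empty(n_depths)
    for z in range(n_depths):
        s = templates[z]
        w = np.linalg.solve(covs[z], s)        # R_z^{-1} s_z
        out[z] = w @ padded[z:z + W] / (w @ s)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_depths, n_scans, W = 256, 64, 9
    # Synthetic data: noise power that grows with depth, plus reflectors.
    sigma = np.linspace(0.5, 2.0, n_depths)
    noise_scans = rng.normal(size=(n_scans, n_depths)) * sigma
    a_scan = rng.normal(size=n_depths) * sigma
    a_scan[[50, 128, 200]] += 10.0             # bright reflectors
    templates = np.tile(np.hanning(W), (n_depths, 1))  # assumed template
    covs = estimate_covariances(noise_scans, W)
    filtered = apply_depth_dependent_filter(a_scan, templates, covs)
    print(filtered[[50, 128, 200]])
```

Solving R_z w = s_z with np.linalg.solve avoids forming an explicit inverse; in practice the empirical covariance may need regularization (e.g., diagonal loading) when the ensemble of scans is small.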
