Optical lens systems generally introduce non-linear distortion artifacts that severely limit the direct interpretation of the images they produce. Image processing can correct for these artifacts, but because the required distortion correction is computationally intensive, it is usually performed offline. Offline processing is not an option, however, for interactive image-based applications, where the real-time display of distortion-corrected images can be vital. To this end, we propose a new technique for correcting arbitrary geometric lens distortion that exploits the parallel processing power of a commercial graphics processing unit (GPU). By offloading the distortion correction to the GPU, we relieve the central processing unit (CPU) of this computationally demanding task. We implemented the full distortion correction algorithm on the GPU, achieving a display rate of over 30 frames/s for fully processed images of 1024 × 768 pixels without the need for any additional digital image processing hardware.
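The core of such a correction is an inverse mapping: for every pixel of the corrected output image, compute where it lies in the distorted source image and resample there. The sketch below illustrates this per-pixel, data-parallel structure in NumPy; a GPU fragment shader would execute essentially the same computation once per output pixel. The function name `undistort` and the use of a simple Brown radial model with coefficients `k1`, `k2` are illustrative assumptions, not the paper's exact distortion model.

```python
import numpy as np

def undistort(img, k1, k2=0.0):
    """Correct radial (barrel/pincushion) distortion by inverse mapping.

    Illustrative sketch (not the paper's exact model): for each output
    pixel, find its source position under the Brown radial model
    r_d = r_u * (1 + k1*r^2 + k2*r^4), then sample the distorted image
    with bilinear interpolation.
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Normalized coordinates of every output pixel, centred on the image.
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    xn, yn = (xs - cx) / cx, (ys - cy) / cy
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    # Source (distorted) coordinates, back in pixel units.
    sx = np.clip(xn * scale * cx + cx, 0, w - 1)
    sy = np.clip(yn * scale * cy + cy, 0, h - 1)
    # Bilinear interpolation between the four neighbouring source pixels.
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
    fx, fy = sx - x0, sy - y0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy
```

Because every output pixel is computed independently, the loop body maps directly onto the GPU's fragment pipeline, which is what makes real-time rates such as 30 frames/s at 1024 × 768 achievable.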