Super-resolution restoration of continuous image sequence using the LMS algorithm

In this paper, we propose a computationally efficient super-resolution restoration algorithm for blurred, noisy, and down-sampled continuous image sequences. The proposed approach is based on a constrained least squares (CLS) super-resolution algorithm, applied recursively in time. An updating equation based on the instantaneous squared-error gradient yields a block-LMS version of the algorithm. An adaptive regularization term is shown to improve the quality of the restored image sequence by enforcing smoothness while preserving edges. The computational complexity of the resulting algorithm is of order O(L^2 log L) per output image, where L^2 is the number of pixels in the output image. Simulations carried out on test sequences show the method to be applicable and efficient, with very promising results. A frequency-domain version (FDBLMS) of the algorithm is proposed for special cases, offering further improvement in computational complexity and rate of convergence.
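The core idea of an instantaneous-gradient (LMS-style) update for super-resolution can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a hypothetical forward model H of Gaussian blur followed by 2x down-sampling, a Laplacian smoothness regularizer C, and illustrative step-size and regularization parameters; per frame, the high-resolution estimate is nudged along the negative gradient of the instantaneous squared error.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

# Hypothetical sketch (not the paper's code): one LMS-style update of the
# high-resolution estimate per incoming low-resolution frame.

def degrade(x, sigma=1.0, factor=2):
    """Forward model H: blur, then down-sample by `factor`."""
    return gaussian_filter(x, sigma)[::factor, ::factor]

def degrade_adjoint(e, shape, sigma=1.0, factor=2):
    """Approximate adjoint H^T: zero-fill up-sample, then blur."""
    up = np.zeros(shape)
    up[::factor, ::factor] = e
    return gaussian_filter(up, sigma)

def lms_sr_step(x, y, mu=0.5, lam=0.01):
    """One instantaneous-gradient update of the high-res estimate x
    given the current low-res frame y, with Laplacian regularization.
    mu and lam are illustrative, not tuned values from the paper."""
    e = degrade(x) - y                   # prediction error on this frame
    grad = degrade_adjoint(e, x.shape)   # data term: H^T (H x - y)
    grad += lam * laplace(laplace(x))    # smoothness term: C^T C x (C = Laplacian)
    return x - mu * grad

# Usage: refine an estimate over a synthetic 3-frame sequence.
rng = np.random.default_rng(0)
truth = rng.random((32, 32))
frames = [degrade(truth) + 0.01 * rng.standard_normal((16, 16))
          for _ in range(3)]
x = degrade_adjoint(frames[0], (32, 32))  # crude initial high-res estimate
for y in frames:
    for _ in range(5):                    # a few LMS steps per frame
        x = lms_sr_step(x, y)
```

Operating directly on each frame's instantaneous error, rather than accumulating a batch least-squares system, is what keeps the per-image cost low; the frequency-domain variant would apply the analogous update to FFT coefficients.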