Numerical comparison of several algorithms for band-limited signal extrapolation.

In this paper we present computer simulation results on the band-limited signal extrapolation problem. First, the performance of several existing algorithms is compared in the noise-free case. We then describe modifications of these algorithms for computing the extrapolation when the given signal is contaminated by noise. Computer simulation results for both the noiseless and noisy cases are included. From these results, the following preliminary conclusion can be drawn: the two-step algorithms appear to give better reconstructions and require less computing time than the iterative algorithms considered in this paper.
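The abstract does not name the specific algorithms compared. As background, a minimal sketch of one classic iterative extrapolation method of the kind alluded to here, the Papoulis-Gerchberg iteration, is given below (an illustrative assumption, not the authors' implementation): each pass enforces the band limit in the frequency domain and then restores the known samples on the observation interval.

```python
import numpy as np

def papoulis_gerchberg(observed, known_mask, band_mask, n_iter=200):
    """Extrapolate a band-limited signal by alternating projections.

    observed   : samples, valid only where known_mask is True
    known_mask : boolean mask of the observation interval
    band_mask  : boolean mask of the DFT bins inside the known band limit
    """
    x = np.where(known_mask, observed, 0.0)
    for _ in range(n_iter):
        # Projection 1: enforce the band limit in the frequency domain.
        X = np.fft.fft(x)
        X[~band_mask] = 0.0
        x = np.fft.ifft(X).real
        # Projection 2: restore the known samples on the observation interval.
        x[known_mask] = observed[known_mask]
    return x

# Toy example: a band-limited signal observed on half of its support.
n = 128
t = np.arange(n)
signal = np.cos(2 * np.pi * 3 * t / n) + 0.5 * np.sin(2 * np.pi * 5 * t / n)
known = np.zeros(n, dtype=bool)
known[32:96] = True               # observation interval
band = np.zeros(n, dtype=bool)
band[:8] = True                   # low-frequency bins ...
band[-7:] = True                  # ... and their conjugate-symmetric partners
estimate = papoulis_gerchberg(signal * known, known, band)
```

In the noise-free case this iteration converges (slowly) to the true signal; the paper's noisy-case modifications address the fact that such iterations can diverge or amplify noise, which is one motivation for the faster two-step alternatives it favors.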