An extrapolation procedure for band-limited signals

In this paper, the task of extrapolating a time-truncated version of a band-limited signal is considered. It is shown that the basic extrapolation operation is feasible only for a particular subset of the class of band-limited signals; that is, only on this subset is the operation mathematically well-posed. An efficient algorithmic method for achieving the desired extrapolation on this subset is then presented. The algorithm is structured so that all necessary signal manipulations involve signals that are everywhere zero except possibly on a finite "observation time" set. As a consequence, its implementation is straightforward and can be carried out in real time. This contrasts with many existing extrapolation algorithms, which theoretically involve operations on signals that are nonzero for almost all values of time; their numerical implementation therefore necessitates an error-producing time truncation, with a resultant deleterious effect on the corresponding extrapolation. Using straightforward algebraic operations, a convenient one-step extrapolation procedure is next developed. This procedure is noteworthy in that it circumvents the potentially slow convergence which typically characterizes iterative extrapolation algorithms. The effectiveness of the one-step procedure is demonstrated by means of two examples.
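The abstract does not spell out either procedure, so the following is only an illustrative sketch in a discrete (DFT) setting: an iterative Papoulis-Gerchberg-style alternation between band-limiting and re-imposing the observed samples, followed by a direct least-squares solve that recovers the in-band coefficients in one step. The function names, masks, and test signal are all hypothetical choices for this sketch, not the paper's construction.

```python
import numpy as np

def pg_extrapolate(y, obs_mask, band_mask, n_iter=500):
    """Iterative sketch: alternately project onto the band-limited
    subspace and restore the samples known on the observation set."""
    x = y.astype(complex)
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X[~band_mask] = 0.0           # enforce the band limit
        x = np.fft.ifft(X)
        x[obs_mask] = y[obs_mask]     # re-impose the observed samples
    return x.real

def one_step_extrapolate(y, obs_mask, band_mask):
    """One-step sketch: solve directly for the in-band DFT coefficients
    that reproduce the observed samples, then evaluate everywhere."""
    N = len(y)
    F = np.fft.ifft(np.eye(N), axis=0)   # columns = discrete exponentials
    A = F[:, band_mask]                  # in-band basis, N x K
    c, *_ = np.linalg.lstsq(A[obs_mask], y[obs_mask].astype(complex),
                            rcond=None)
    return (A @ c).real

# Demo: a band-limited signal observed only on a 192-sample window.
N = 256
t = np.arange(N)
sig = np.cos(2 * np.pi * 3 * t / N) + 0.5 * np.sin(2 * np.pi * 5 * t / N)

band = np.zeros(N, dtype=bool)
band[:6] = band[-5:] = True            # retain DFT bins |k| <= 5
obs = np.zeros(N, dtype=bool)
obs[32:224] = True                     # the "observation time" set

y = np.where(obs, sig, 0.0)            # time-truncated observation
est_iter = pg_extrapolate(y, obs, band)
est_one = one_step_extrapolate(y, obs, band)
```

The one-step solve replaces the iteration with a single linear-algebra step, which is the spirit (though not necessarily the letter) of the abstract's remark about circumventing slow convergence.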