Resampling of Band-Limited Gaussian Random Signals With Flat Power Spectrum, Available Through 1-Bit Quantized Samples

A theoretical analysis is presented for evaluating the probability of error incurred when resampling a noise-like, band-limited Gaussian signal with flat power spectrum that is available only through its digitized samples. The analysis assumes an ideal sinc-based interpolation algorithm for reconstructing the digitized signal, which is proved to be optimum for the considered class of signals and quantization functions. The particular case of the lowest-order, i.e., 1-bit, quantization function is treated fully in analytical terms, and a theoretical prediction for the error probability is derived. The analysis is validated through comparison with numerical simulations.
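The setup described in the abstract lends itself to a quick Monte Carlo illustration. The following Python sketch (all names and parameter values are illustrative, not taken from the paper) generates Nyquist-rate samples of a flat-spectrum band-limited Gaussian signal, applies a 1-bit (sign) quantizer, reconstructs with the ideal sinc interpolator, and estimates the fraction of resampling instants at which the re-quantized resample disagrees in sign with the underlying signal. This is a rough empirical counterpart to the error probability analyzed here, under the assumption that an "error" means a sign disagreement at the new sampling instants.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 2048    # Nyquist-rate samples (illustrative choice)
ovs = 4     # oversampling factor for the resampling grid

# A flat-spectrum band-limited Gaussian signal has i.i.d. Gaussian
# samples at the Nyquist rate, so white samples plus ideal sinc
# interpolation synthesize the "analog" reference signal.
x = rng.standard_normal(N)
n = np.arange(N)
t = np.arange(N * ovs) / ovs   # resampling instants, in Nyquist periods

def sinc_interp(samples, sample_times, query_times):
    """Ideal band-limited interpolation: y(t) = sum_n s[n] sinc(t - n)."""
    return np.array([np.dot(samples, np.sinc(tq - sample_times))
                     for tq in query_times])

x_ref = sinc_interp(x, n, t)   # dense reference ("analog") signal

# 1-bit quantization of the available samples (sign quantizer).
q = np.sign(x)

# Reconstruct from the 1-bit samples with the same ideal interpolator.
x_rec = sinc_interp(q, n, t)

# ASSUMPTION: an "error" is a sign disagreement between the re-quantized
# resample and the underlying signal at the new instants. Edge instants
# are discarded to limit truncation effects of the finite sinc sum.
keep = (t > 32) & (t < N - 32)
p_err = np.mean(np.sign(x_rec[keep]) != np.sign(x_ref[keep]))
print(f"empirical error probability ~ {p_err:.4f}")
```

Note that this sketch is only a sanity check: the finite-length sinc sum introduces a small truncation bias, and errors occur only at fractional resampling instants, since at the original sampling instants the reconstructed and original signals agree in sign by construction.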