System performance in the presence of stochastic delays
System performance in the presence of additive noise has received considerable attention in the past. With the advent of sophisticated data-processing systems using high time-bandwidth products, it becomes necessary to consider the effects of stochastic delays as well. Depending on the particular application, the mathematical object to be considered is either a stochastic process of a stochastic process or a stochastic process of a fixed function. If x(t), a stochastic process, represents the stochastic delay, and f(t) is either a stochastic process or a fixed function (in which case f(t) represents the ideal signal), then the process to be studied is g(t) = f[t + x(t)]. This paper derives various properties of g(t) and applies them to system performance. In the first section the correlation function and power spectrum of g(t) are derived in terms of the characteristic function of x(t) and the power density spectrum of f(t), where f(t) is a random process. It is shown that the effect of stochastic delays is to increase the bandwidth of the received data. In the second section these results are applied to optimum least-squares filtering; optimum linear filters and expressions for the mean-square error are derived, and important special cases are considered. In particular, it is shown that, for delays with equal first-order statistics, high-frequency delays cause less degradation than low-frequency delays. Along similar lines, some results on random jitter in sampling are derived, and the degradation due to high- and low-frequency jitter is compared. At high sampling rates, low-frequency jitter causes more error; but when the sampling rate is matched to the bandwidth of f(t), high-frequency jitter causes greater degradation. In the final section the effect of stochastic delays on the ambiguity function is considered. Expressions are derived showing that the presence of stochastic delays degrades both time and Doppler resolution.
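The abstract's central claim, that a stochastic delay broadens the spectrum of the received data, can be illustrated numerically. The sketch below is not the paper's derivation; it simply forms g(t) = f[t + x(t)] for an assumed sinusoidal f(t) and an assumed white Gaussian delay x(t), then compares a second-moment (RMS) bandwidth of the two signals. All parameter values (tone frequency, delay standard deviation, sampling grid) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 4096, 1e-3          # number of samples and sampling interval (s)
t = np.arange(n) * dt
f0 = 50.0                   # tone frequency (Hz); stands in for the ideal f(t)

def rms_bandwidth(sig, dt):
    """Second-moment (RMS) bandwidth of the one-sided power spectrum."""
    S = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), dt)
    mean_f = np.sum(freqs * S) / np.sum(S)
    return np.sqrt(np.sum((freqs - mean_f) ** 2 * S) / np.sum(S))

# Ideal signal f(t) and its randomly delayed version g(t) = f[t + x(t)].
# x(t) is modeled here (an assumption) as white Gaussian jitter, std 2 ms.
x = rng.normal(scale=2e-3, size=n)
f_t = np.cos(2 * np.pi * f0 * t)
g_t = np.cos(2 * np.pi * f0 * (t + x))

print(rms_bandwidth(f_t, dt), rms_bandwidth(g_t, dt))
```

The random phase modulation induced by x(t) scatters power away from the tone into a broadband floor, so the RMS bandwidth of g(t) comes out markedly larger than that of f(t); for a general f(t) sampled on a grid, g(t) would be formed by interpolating f at the jittered instants t + x(t) rather than by direct evaluation.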