Closed-loop least mean square time-delay estimator

An LMS closed-loop time-delay estimator is presented. It uses the error between two samples of the incoming signal (the difference between the delayed signal and the reference signal passed through a known delay) as a performance index for the estimator. The LMS algorithm adaptively adjusts the controlled delay so as to minimize the mean-square value of this error. The controlled delay is implemented with surface acoustic wave devices. Under certain design conditions, the performance surface has a unique minimum. It is shown that the proposed estimator is unbiased and has small variance when the input signal occupies most of the system bandwidth. Specifically, the variance depends on the input noise power, on the generalized noise-to-signal power ratio R''_n(0)/R''_s(0), and on the loop gain. The analysis also yields a bound on the loop gain required for convergence of the estimator and predicts its rate of convergence. Computer simulation results show good agreement with the theory.
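The closed-loop idea above can be sketched in discrete time: form the error between the delayed observation and the reference passed through an adjustable delay, then nudge the delay estimate down the gradient of the squared error. The sketch below is a minimal illustration under assumed conditions (a hypothetical broadband test signal, fractional delay realized by linear interpolation rather than a SAW device, and an arbitrarily chosen loop gain `mu`); it is not the paper's hardware implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical broadband reference signal: smoothed random walk,
# standing in for a signal that occupies most of the system bandwidth.
n = 4000
t = np.arange(n, dtype=float)
s = np.cumsum(rng.standard_normal(n))
s = np.convolve(s - s.mean(), np.ones(5) / 5, mode="same")

true_delay = 7.3  # unknown delay to be estimated, in samples (assumed)
noise_std = 0.05  # additive input noise level (assumed)
x = np.interp(t - true_delay, t, s) + noise_std * rng.standard_normal(n)

mu = 0.05   # loop gain (assumed; must be small enough for convergence)
d_hat = 0.0  # initial delay estimate

for k in range(50, n):
    # Reference passed through the controlled (estimated) delay,
    # realized here by linear interpolation.
    s_shift = np.interp(k - d_hat, t, s)
    # Local derivative of the delayed reference (central difference),
    # which supplies the gradient direction.
    ds = np.interp(k - d_hat + 0.5, t, s) - np.interp(k - d_hat - 0.5, t, s)
    # Error between delayed observation and delayed reference.
    e = x[k] - s_shift
    # LMS update: descend the gradient of e^2 with respect to d_hat.
    d_hat -= mu * e * ds

print(f"estimated delay: {d_hat:.2f} (true: {true_delay})")
```

The loop gain `mu` plays the role described in the abstract: too large and the loop diverges, while smaller values trade convergence speed for lower steady-state variance of the delay estimate.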