Time delay estimation

A maximum likelihood (ML) estimator is derived for estimating the time delay between two signals observed in the presence of uncorrelated noise, under the assumption that the signal and noise spectral characteristics are known. This ML estimator can be realized as a pair of receiver prefilters followed by a cross-correlator; the time argument at which the correlator output achieves its maximum is the delay estimate. Qualitatively, the role of the prefilters is to weight the signal passed to the correlator according to the strength of the coherence function. Other realizations of the ML processor are also discussed. The variance of a generalized correlation time delay estimator is derived for the case where the estimate lies in the neighborhood of the true delay. An example using these results is given, with emphasis on the effect of erroneously specifying the frequency weighting to be employed. Limitations of the derived results are also discussed.
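The processor structure described above (prefilter each received signal, cross-correlate, pick the lag of the peak) can be sketched in discrete time. The sketch below is illustrative, not the paper's derivation: it folds the two prefilters into a single frequency-domain weighting of the cross-spectrum, and uses the phase transform (PHAT) as one example weighting, with the unweighted case reducing to the plain cross-correlator. The function name `gcc_delay`, the sampling rate, and the toy two-sensor scenario are all assumptions for illustration.

```python
import numpy as np

def gcc_delay(x, y, fs, weighting="phat"):
    """Generalized cross-correlation estimate of the delay of x relative to y.

    The frequency-domain weighting stands in for the receiver prefilters of
    the ML processor; "phat" (phase transform) is one illustrative choice,
    and "none" reduces to the plain cross-correlator.
    """
    # Zero-pad to a power of two at least as long as the linear correlation.
    nfft = 1 << (len(x) + len(y) - 1).bit_length()
    Gxy = np.fft.rfft(x, nfft) * np.conj(np.fft.rfft(y, nfft))  # cross-spectrum
    if weighting == "phat":
        Gxy /= np.abs(Gxy) + 1e-12  # keep phase, discard magnitude (whitening)
    cc = np.fft.irfft(Gxy, nfft)    # cross-correlation via inverse FFT
    max_lag = min(len(x), len(y)) - 1
    # Reorder so the array covers lags -max_lag .. +max_lag.
    cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))
    return (np.argmax(cc) - max_lag) / fs  # peak location is the delay estimate

# Toy check: a broadband source seen by two sensors with uncorrelated noise,
# one channel delayed by a known number of samples.
rng = np.random.default_rng(0)
fs, d = 8000, 37
s = rng.standard_normal(4096)
y = s + 0.1 * rng.standard_normal(s.size)
x = np.concatenate((np.zeros(d), s[:-d])) + 0.1 * rng.standard_normal(s.size)
print(round(gcc_delay(x, y, fs) * fs))  # estimated delay in samples
```

In this sketch the choice of weighting is exactly the "frequency weighting" whose mis-specification the section's example examines: for a broadband source in white noise both weightings locate the peak, but the variance of the estimate depends on how well the weighting matches the true signal and noise spectra.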