DELAY ESTIMATION BY A HILBERT TRANSFORM METHOD

Summary This paper describes a method for estimating the time delay between two stationary time series, where the input signal is measured with little noise and the output signal is the sum of noise and the response of a linear system. The time delay is estimated using the Hilbert transform relation that holds for minimum delay systems. Computer simulation results are given to evaluate the performance of the proposed method.
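
As a rough illustration of the underlying idea (not the paper's exact procedure), the sketch below estimates the delay from the transfer function H(f) = Sxy(f)/Sxx(f): for a minimum delay system the phase is determined by the Hilbert transform of the log magnitude, so the residual (excess) phase after removing the minimum-phase part is linear in frequency with slope proportional to the delay. The function name `estimate_delay`, the use of SciPy's `csd`/`welch` spectral estimators, and the cepstral construction of the minimum-phase component are all illustrative assumptions.

    # A minimal sketch, assuming a sampled input x and a delayed,
    # noisy output y; names and parameters here are hypothetical.
    import numpy as np
    from scipy.signal import csd, welch

    def estimate_delay(x, y, fs=1.0, nperseg=256):
        # Transfer function estimate H(f) = Sxy(f) / Sxx(f).
        f, Sxy = csd(x, y, fs=fs, nperseg=nperseg)
        _, Sxx = welch(x, fs=fs, nperseg=nperseg)
        H = Sxy / Sxx

        # Minimum-phase part: its phase is the (negative) Hilbert
        # transform of log|H|, computed here via the real cepstrum.
        n = 2 * (len(H) - 1)
        mag = np.abs(H)
        logmag = np.log(np.concatenate([mag, mag[-2:0:-1]]) + 1e-12)
        cep = np.fft.ifft(logmag).real
        fold = np.zeros(n)              # fold the cepstrum onto its
        fold[0] = cep[0]                # causal part to obtain the
        fold[1:n//2] = 2 * cep[1:n//2]  # minimum-phase equivalent
        fold[n//2] = cep[n//2]
        phase_min = np.imag(np.fft.fft(fold))[:len(H)]

        # Excess phase of a pure delay D is -2*pi*f*D; a least-squares
        # line fit of the residual phase against f recovers D.
        excess = np.unwrap(np.angle(H)) - phase_min
        slope = np.polyfit(f, excess, 1)[0]
        return -slope / (2 * np.pi)

For example, with a white-noise input and an output delayed by 10 samples plus additive noise, `estimate_delay(x, y, fs=1.0)` should return a value close to 10; the cepstral folding step is one standard way to realize the Hilbert transform relation for discrete spectra.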