A new constrained least mean square time-delay estimation system

An adaptive filter for estimating the time delay of a signal between two split-array outputs is described. A least-mean-square (LMS) algorithm adapts the filter coefficients, which are constrained to be samples of a sinc function. The newly configured constrained LMS time-delay estimator converges faster and tracks time-varying delays more effectively. In addition, the same filter length can measure a longer delay without incurring a larger truncation error.
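The abstract gives no equations, but the idea it describes lends itself to a short numerical illustration. The Python sketch below shows one plausible realization of a sinc-constrained LMS delay estimator, not necessarily the paper's algorithm: rather than adapting 2P+1 free taps, the coefficients are held to w_n = sinc(n - D_hat) and the single delay parameter D_hat is adapted by an LMS-style gradient step on the output error, which is what allows a fixed filter length to cover fractional and longer delays. All function names, the gradient form, and the parameter values are illustrative assumptions.

```python
import numpy as np


def sinc_deriv(u):
    """Derivative of the normalized sinc function sin(pi*u)/(pi*u)."""
    u = np.asarray(u, dtype=float)
    out = np.zeros_like(u)                       # sinc'(0) = 0 (sinc is even)
    nz = np.abs(u) > 1e-12
    out[nz] = (np.cos(np.pi * u[nz]) - np.sinc(u[nz])) / u[nz]
    return out


def constrained_lms_tde(x, r, P=16, mu=0.005, d0=0.0):
    """Sinc-constrained LMS delay estimation (illustrative sketch).

    x  : reference sensor sequence
    r  : delayed sensor sequence, r(k) ~ x(k - D) + noise
    P  : taps span n = -P .. P (2P+1 coefficients)
    mu : LMS step size
    d0 : initial delay estimate, in samples

    Returns the trajectory of delay estimates D_hat(k).
    """
    n = np.arange(-P, P + 1)
    d_hat = float(d0)
    track = np.empty(len(x) - 2 * P)
    for i, k in enumerate(range(P, len(x) - P)):
        xv = x[k - P : k + P + 1][::-1]          # x(k+P) .. x(k-P), i.e. x(k-n)
        w = np.sinc(n - d_hat)                   # taps constrained to sinc samples
        e = r[k] - w @ xv                        # output error
        # Gradient of e^2 w.r.t. d_hat: 2 e * sum_n sinc'(n - d_hat) x(k - n)
        d_hat -= mu * 2.0 * e * (sinc_deriv(n - d_hat) @ xv)
        track[i] = d_hat
    return track


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(4000)
    D = 2.3                                      # true fractional delay (assumed)
    # Fractional delay by sinc interpolation: r(k) = sum_n sinc(n - D) x(k - n)
    r = np.convolve(x, np.sinc(np.arange(-16, 17) - D), mode="same")
    # d0 is started near the true delay, e.g. from a coarse correlation peak
    est = constrained_lms_tde(x, r, P=16, mu=0.005, d0=2.0)
    print(f"final delay estimate: {est[-1]:.3f}  (true: {D})")
```

Because the mean-square-error surface is multimodal in D_hat (it follows the lobes of the sinc autocorrelation), the sketch initializes the estimate near the true delay; in practice a coarse cross-correlation search could supply that starting point before the LMS loop refines the fractional part.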