Adaptive algorithms using order statistic based gradient estimate

The problem of adaptive filtering in non-Gaussian noise environments is considered. Ordinary gradient-descent adaptive algorithms, such as the Least Mean Square (LMS) algorithm, suffer performance degradation in the presence of non-Gaussian noise. To enhance the performance of the LMS algorithm in these situations, an order statistic operation is applied to filter the gradient estimates used in the LMS update. This leads to a class of adaptive algorithms using order statistic based gradient estimates, termed the Order Statistic Least Mean Square (OSLMS) algorithms. A comprehensive study of the OSLMS algorithms is presented in this dissertation, including examination of their convergence properties and steady-state behavior. It is shown that when the input signals are independent identically distributed (iid) and symmetrically distributed, the coefficient estimates of the OSLMS algorithms converge to a small region about the optimal values. Simulations provide supporting evidence for this convergence. The results also show that the order statistic operation in OSLMS can reduce the variance of the gradient estimate (relative to LMS) when operating in non-Gaussian noise environments; a consequence is that the steady-state excess mean square error can be reduced. As a measure of performance, the mean squared coefficient error of OSLMS is evaluated under a range of noise distributions and OS operators, and guidelines for selecting the OS operator are presented based on these results. A study of the use of OSLMS algorithms for adaptive equalization of data transmission channels is also conducted. The performance of OSLMS relative to LMS is evaluated in numerical simulation trials; OSLMS is shown to reduce the bit error rate and to allow operation of the adaptive equalizer at lower signal-to-noise ratios.
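
To make the idea summarized above concrete, the following Python sketch shows one possible OSLMS variant in which the per-coefficient instantaneous gradient terms e(n)x(n-k) are median-filtered over a short sliding window before the coefficient update. The function name oslms_median, the choice of the median as the OS operator, the window length, the step size, and the system-identification test setup are all illustrative assumptions for this sketch, not the specific configurations studied in the dissertation.

```python
import numpy as np

def oslms_median(x, d, num_taps=8, mu=0.01, window=5):
    """Sketch of an order statistic LMS update (median variant).

    Instead of applying the instantaneous gradient e(n)*x(n) directly
    (plain LMS), each coefficient's recent gradient terms are combined
    with a median, an order statistic operator, to suppress outliers
    caused by impulsive (non-Gaussian) noise.
    """
    n_samples = len(x)
    w = np.zeros(num_taps)                    # adaptive filter coefficients
    grad_hist = np.zeros((window, num_taps))  # recent per-tap gradient terms
    e = np.zeros(n_samples)

    for n in range(num_taps, n_samples):
        u = x[n - num_taps + 1:n + 1][::-1]   # tap-delay-line input vector
        y = w @ u                             # filter output
        e[n] = d[n] - y                       # error against desired signal

        # push the newest instantaneous gradient estimate into the window
        grad_hist = np.roll(grad_hist, 1, axis=0)
        grad_hist[0] = e[n] * u

        # OS operation: per-coefficient median of the windowed gradient terms
        g = np.median(grad_hist, axis=0)

        w = w + mu * g                        # coefficient update
    return w, e

if __name__ == "__main__":
    # Hypothetical system-identification example with impulsive noise.
    rng = np.random.default_rng(0)
    N = 5000
    x = rng.standard_normal(N)
    h_true = np.array([1.0, 0.5, -0.3, 0.1, 0.05, 0.0, 0.0, 0.0])
    clean = np.convolve(x, h_true)[:N]
    noise = 0.05 * rng.standard_normal(N)     # small background noise
    impulses = rng.random(N) < 0.02           # rare large outliers
    noise[impulses] += 5.0 * rng.standard_normal(impulses.sum())
    d = clean + noise
    w_hat, err = oslms_median(x, d, num_taps=8, mu=0.01, window=5)
    print("coefficient error norm:", np.linalg.norm(w_hat - h_true))
```

Replacing np.median with another order statistic operator, such as a trimmed mean over the same window, yields other members of the OSLMS class; the window length and operator trade robustness to impulsive noise against tracking speed.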