Interference suppression using transform domain LMS adaptive filtering

In many communications systems, spread spectrum techniques are used to spread the original message data over a large bandwidth in order to improve system performance in the presence of wide-sense stationary interference. The extent to which such interference can be tolerated depends on the system's processing gain and may be augmented using adaptive filtering techniques. Pre- and post-correlation transform domain least-mean-square (LMS) algorithms are used to suppress the interference while simultaneously minimizing the mean-square error between the received signal and the original data message. The misadjustment noise and its effect on the decision-variable statistics are analyzed and illustrated in terms of the overall system bit-error-rate (BER). Analytical and simulation results obtained in the presence of single-tone interference sources are presented.
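To make the transform-domain LMS idea concrete, the following is a minimal sketch, not the paper's algorithm: a DFT-domain LMS filter with per-bin power normalization, applied to a BPSK chip sequence corrupted by a single-tone interferer. All names and parameter values (N, mu, beta, the jammer frequency and amplitude) are illustrative assumptions, and the reference chips are assumed available as a training signal.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16          # filter length / transform size (assumed)
mu = 0.05       # LMS step size (assumed)
beta = 0.99     # per-bin power smoothing factor (assumed)
n_chips = 4000  # number of simulated chips (assumed)

# Reference chips, single-tone interferer, and additive noise (all assumed values).
d = rng.choice([-1.0, 1.0], size=n_chips)               # spread-spectrum chip sequence
t = np.arange(n_chips)
jam = 3.0 * np.cos(2 * np.pi * 0.12 * t + 0.7)          # single-tone interference
x = d + jam + 0.1 * rng.standard_normal(n_chips)        # received samples

w = np.zeros(N, dtype=complex)   # adaptive weights, one per DFT bin
p = np.full(N, 1e-3)             # running power estimate per bin (for step normalization)

y_hat = np.zeros(n_chips)
for n in range(N - 1, n_chips):
    u = x[n - N + 1 : n + 1][::-1]   # most recent N samples, newest first
    U = np.fft.fft(u)                # transform-domain input vector
    y = np.real(np.conj(w) @ U)      # filter output
    e = d[n] - y                     # error against the reference chip
    p = beta * p + (1 - beta) * np.abs(U) ** 2
    w += (mu / p) * e * U            # power-normalized transform-domain LMS update
    y_hat[n] = y

# Crude before/after check: chip-decision error rate on the raw vs. filtered signal.
print("raw decision errors:     ", np.mean(np.sign(x[N - 1:]) != d[N - 1:]))
print("filtered decision errors:", np.mean(np.sign(y_hat[N - 1:]) != d[N - 1:]))
```

Because the single-tone interferer concentrates its energy in a few DFT bins, the per-bin normalized step sizes let those bins adapt quickly and be attenuated, which is the usual motivation for working in the transform domain rather than with a time-domain LMS filter.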