The momentum least-mean-square (MLMS) algorithm, a modified version of the well-known LMS algorithm, has recently been proposed, and an analysis of its basic convergence properties has been given. The authors revise the parameter ranges for which convergence of the MLMS algorithm is guaranteed, and provide precise expressions for its convergence rate and steady-state performance under slow-learning conditions. It is shown that, with Gaussian inputs and a low adaptation rate, the LMS and MLMS algorithms are equivalent, but with inputs containing impulse noise components the MLMS algorithm performs better: its increased inertia makes it more robust against short-term disturbances in the filter input, at the expense of increased computational complexity.
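As a concrete illustration, the momentum modification is commonly written as the two-term weight update w[n+1] = w[n] + mu*e[n]*x[n] + alpha*(w[n] - w[n-1]), which reduces to ordinary LMS when the momentum parameter alpha is zero. The following is a minimal sketch of that update in a system-identification setting; the function name, signal lengths, and parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def mlms_identify(x, d, num_taps, mu, alpha):
    """Momentum LMS sketch: w[n+1] = w[n] + mu*e[n]*u[n] + alpha*(w[n] - w[n-1]).

    alpha = 0 recovers the standard LMS update; |alpha| < 1 is assumed
    for stability. Parameter names are illustrative.
    """
    w = np.zeros(num_taps)       # current weight vector w[n]
    w_prev = np.zeros(num_taps)  # previous weight vector w[n-1]
    errors = []
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # tap vector [x[n], ..., x[n-M+1]]
        e = d[n] - w @ u                       # a priori output error
        w_next = w + mu * e * u + alpha * (w - w_prev)
        w_prev, w = w, w_next
        errors.append(e)
    return w, np.asarray(errors)

# Identify a short FIR system from its noiseless input/output data.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2])            # unknown system (illustrative)
x = rng.standard_normal(5000)             # Gaussian input
d = np.convolve(x, h)[:len(x)]            # desired (system output) signal
w, err = mlms_identify(x, d, num_taps=3, mu=0.01, alpha=0.5)
```

With a small step size mu and moderate alpha, the weights converge to the true impulse response; the momentum term low-pass filters the weight trajectory, which is the "inertia" the abstract credits for robustness to impulsive input disturbances.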