An Improved Exponentiated Stochastic Gradient Algorithm

Recently, a few stochastic gradient algorithms have been proposed that are based on cost functions with an exponential dependence on the chosen error. However, we have observed that the proposed cost function based on the exponential of the squared error does not always converge. In this paper we modify this cost function to guarantee convergence, obtaining a new IE (Improved Exponentiated) stochastic gradient algorithm. The resulting method has attractive properties in both stationary and abrupt-change situations.
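To illustrate the class of algorithms the abstract refers to, the following is a minimal sketch of a stochastic gradient adaptive filter driven by an exponentiated squared-error cost J(n) = exp(λ e²(n)). This is the generic exponentiated-error update, not the improved cost function proposed in the paper; the function name and the parameters `mu`, `lam`, and `num_taps` are illustrative assumptions.

```python
import numpy as np

def exp_lms_identify(x, d, num_taps=4, mu=0.01, lam=0.5):
    """Adaptive FIR filter using the exponentiated squared-error cost
    J(n) = exp(lam * e(n)**2) (illustrative sketch, not the paper's
    improved IE cost). Differentiating J with respect to the weights
    gives the stochastic gradient update

        w <- w + mu * lam * exp(lam * e**2) * e * u(n).

    Note that exp(lam * e**2) grows with the error, so a large
    transient error inflates the effective step size -- consistent
    with the abstract's observation that this cost does not always
    converge.
    """
    w = np.zeros(num_taps)
    errors = []
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]   # regressor, most recent sample first
        e = d[n] - w @ u                        # a priori error
        w = w + mu * lam * np.exp(lam * e**2) * e * u
        errors.append(e)
    return w, np.array(errors)
```

For a system-identification run (desired signal generated by a known FIR filter plus small noise), the error power decays toward the noise floor provided `mu` and `lam` are kept small enough that the exponential factor stays bounded.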
