Appeared in Proceedings of the Seventh Yale Workshop on Adaptive and Learning Systems, pp.

Gain Adaptation Beats Least Squares

I present computational results suggesting that gain adaptation algorithms based in part on connectionist learning methods may improve over least squares and other classical parameter-estimation methods for stochastic time-varying linear systems. The new algorithms are evaluated with respect to classical methods along three dimensions: asymptotic error, computational complexity, and required prior knowledge about the system. The new algorithms are all of the same order of complexity as LMS methods, O(n), where n is the dimensionality of the system, whereas least squares methods and the Kalman filter are O(n²). The new methods also improve over the Kalman filter in that they do not require a complete statistical model of how the system varies over time. In a simple computational experiment, the new methods are shown to produce asymptotic error levels near that of the optimal Kalman filter and significantly below those of least squares and LMS methods. The new methods may perform better even than the Kalman filter if there is any error in the filter's model of how the system varies over time.
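
To make the O(n) complexity claim concrete, the following is a minimal sketch, not the paper's exact algorithms, contrasting a plain LMS update with an LMS-style update that adapts one gain per parameter in the spirit of IDBD-style gain adaptation. All function names, the meta-rate parameter, and the drifting-target test loop are illustrative assumptions; both updates touch each of the n parameters a constant number of times per step, so both are O(n).

```python
import numpy as np

def lms_update(w, x, y, alpha=0.01):
    """Plain LMS: one shared step size for all weights, O(n) per step.
    (Illustrative sketch; parameter names are assumptions.)"""
    err = y - w @ x
    return w + alpha * err * x

def gain_adaptive_update(w, log_alpha, h, x, y, meta_rate=0.01):
    """LMS-style update with one adaptive gain per weight, in the spirit of
    IDBD-style gain adaptation (a hedged sketch, not the paper's exact
    algorithms). Still O(n) per step: every quantity is a length-n vector."""
    err = y - w @ x
    # Meta-level update: adjust each weight's log step size.
    log_alpha = log_alpha + meta_rate * err * x * h
    alpha = np.exp(log_alpha)
    # Base-level update: weight change with individual gains.
    w = w + alpha * err * x
    # Decaying trace of each weight's recent correlation with the error.
    h = h * np.maximum(0.0, 1.0 - alpha * x * x) + alpha * err * x
    return w, log_alpha, h

# Usage sketch: track a slowly drifting linear target (a stand-in for a
# stochastic time-varying linear system; constants are arbitrary).
rng = np.random.default_rng(0)
n = 10
true_w = rng.normal(size=n)
w = np.zeros(n)
log_alpha = np.full(n, np.log(0.01))
h = np.zeros(n)
for t in range(1000):
    true_w += 0.001 * rng.normal(size=n)      # system drifts over time
    x = rng.normal(size=n)
    y = true_w @ x + 0.1 * rng.normal()       # noisy scalar observation
    w, log_alpha, h = gain_adaptive_update(w, log_alpha, h, x, y)
```

By contrast, a recursive-least-squares or Kalman-filter update maintains and multiplies an n-by-n covariance matrix each step, which is where the O(n²) per-step cost cited in the abstract comes from.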