Global analysis of Oja's flow for neural networks

This paper undertakes a detailed study of Oja's learning equation in neural networks. Fundamental issues such as the existence, uniqueness, and representation of solutions are completely resolved, as is the question of convergence: it is shown that from any initial value the solution of Oja's equation converges exponentially to an equilibrium. Moreover, necessary and sufficient conditions on the initial value are given for the solution to converge to a dominant eigenspace of the associated autocorrelation matrix. As a by-product, this result confirms one of Oja's conjectures, namely that the solution converges to the principal eigenspace from almost all initial values. Further characteristics of the limiting solution are also revealed; these allow the limiting solution to be determined in advance from the initial value alone. Two examples are analyzed that demonstrate the explicit dependence of the limiting solution on the initial value. In another direction, it is found that Oja's equation is the gradient flow of generalized Rayleigh quotients on a Stiefel manifold.
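For concreteness, Oja's flow can be written in a standard matrix form consistent with the abstract's description; the notation here ($W$, $C$, $n$, $p$) is chosen for illustration and is not fixed by the abstract itself. Writing $C = E[xx^\top]$ for the autocorrelation matrix of the input $x \in \mathbb{R}^n$ and $W(t) \in \mathbb{R}^{n \times p}$ for the weight matrix, the flow reads
$$
\dot W \;=\; \bigl(I_n - W W^\top\bigr)\, C\, W .
$$
On the Stiefel manifold $\mathrm{St}(p,n) = \{\, W \in \mathbb{R}^{n \times p} : W^\top W = I_p \,\}$, this is, up to a constant factor, the gradient flow of the Rayleigh-quotient-type cost $R(W) = \operatorname{tr}\bigl(W^\top C W\bigr)$ with respect to the induced metric; the generalized Rayleigh quotients referred to above would weight this trace, e.g. $R_N(W) = \operatorname{tr}\bigl(N\, W^\top C W\bigr)$ for a diagonal matrix $N$, though the paper's precise form may differ.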