Orthogonal algorithm for minor and principal subspace extraction

This paper elaborates on an orthogonal version of the Oja (1992) method for estimating the minor and principal subspaces of a vector sequence. The proposed method extracts principal components and, with a simple sign change in the update, also serves as a minor component extractor. It has the same computational complexity as the Oja method but guarantees the orthogonality of the weight matrix at each iteration. Moreover, simulation results show that, for minor subspace extraction, the new algorithm is numerically more stable than the Oja algorithm.
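For illustration, the following is a minimal NumPy sketch of an Oja-type subspace iteration, not the paper's exact update: the same learning rule tracks the principal subspace, flipping the sign of the step tracks the minor subspace, and an explicit QR re-orthonormalization stands in for the orthogonality guarantee that the proposed method builds into the update itself. The function name, step-size handling, and QR step are illustrative assumptions.

```python
import numpy as np

def oja_subspace_step(W, x, eta, minor=False):
    """One Oja-type subspace update followed by re-orthonormalization (sketch).

    W     : (n, p) weight matrix with orthonormal columns
    x     : (n,) data sample
    eta   : step size
    minor : if True, flip the update sign to track the minor subspace
    """
    y = W.T @ x                                   # projected sample, shape (p,)
    grad = np.outer(x, y) - W @ np.outer(y, y)    # (I - W W^T) x x^T W
    W = W - eta * grad if minor else W + eta * grad
    # Explicit QR orthonormalization, used here only for illustration;
    # the paper's method keeps W orthogonal without this costlier step.
    Q, R = np.linalg.qr(W)
    W = Q * np.sign(np.diag(R))                   # fix QR sign ambiguity
    return W
```

In practice one would initialize W with orthonormal columns, e.g. np.linalg.qr(np.random.randn(n, p))[0], and apply this step to each incoming sample x.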

[1] Bin Yang et al., Projection approximation subspace tracking, 1995, IEEE Trans. Signal Process.

[2] S. Amari et al., A self-stabilized minor subspace rule, 1998, IEEE Signal Processing Letters.

[3] Z. Yosimura et al., Universal coefficient sequences for cohomology theories of CW-spectra. II, 1975.

[4] E. Oja et al., On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix, 1985.

[5] Erkki Oja et al., Principal components, minor components, and linear neural networks, 1992, Neural Networks.

[6] Qin Lin et al., A unified algorithm for principal and minor components extraction, 1998, Neural Networks.

[7] John B. Moore et al., Global analysis of Oja's flow for neural networks, 1994, IEEE Trans. Neural Networks.

[8] Victor Solo et al., Performance analysis of adaptive eigenanalysis algorithms, 1998, IEEE Trans. Signal Process.

[9] Lawrence Markus et al., Global stability criteria for differential systems, 1960.

[10] K. Abed-Meraim et al., Natural power method for fast subspace tracking, 1999, Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop (Cat. No.98TH8468).

[11] Wei-Yong Yan et al., Global convergence of Oja's subspace algorithm for principal component extraction, 1998, IEEE Trans. Neural Networks.