Fast principal component extraction by a weighted information criterion

Principal component analysis (PCA) is an essential technique for data compression and feature extraction, and there has been much interest in developing fast PCA algorithms. Building on the concepts of weighted subspace and information maximization, this paper proposes a weighted information criterion (WINC) for finding the optimal weights of a linear neural network. We analytically show that the optimal weights globally asymptotically converge to the principal eigenvectors of the correlation matrix of a stationary vector stochastic process. By analyzing the stability of the equilibrium of the proposed criterion, we establish how the choice of the weighting matrix depends on the statistics of the input process, and thereby reveal the constraint that the weighting matrix must satisfy. Based on the WINC, we develop two adaptive algorithms that extract multiple principal components in parallel. Both algorithms provide an adaptive step size, which leads to a significant improvement in learning performance. Furthermore, the recursive least squares (RLS) version of the WINC algorithm has a low computational complexity of O(Np), where N is the input vector dimension and p is the number of desired principal components. The WINC algorithm corresponds to a three-layer linear neural network model that performs the extraction of multiple principal components in parallel, and it generalizes several well-known PCA/PSA algorithms through appropriate parameter choices. Since the weighting matrix does not require a precise value, the system design of the WINC algorithm is simplified for practical applications. The accuracy and speed advantages of the WINC algorithm are verified through simulations.
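The abstract does not reproduce the criterion itself, so as orientation only, the NumPy sketch below implements the unweighted information-criterion backbone that the WINC builds on: batch gradient ascent on Miao and Hua's NIC objective, J(W) = tr(ln(W^T R W)) - tr(W^T W), whose maximizers are orthonormal bases of the principal subspace. The step size, iteration count, and verification step are illustrative assumptions, and the paper's weighting matrix, which forces convergence to the individual eigenvectors, is deliberately not guessed at here.

```python
import numpy as np

def nic_subspace(R, p, eta=0.3, n_iter=500, seed=0):
    """Batch gradient ascent on the NIC objective
        J(W) = tr(ln(W^T R W)) - tr(W^T W).
    Returns W whose columns span the p-dimensional principal subspace
    of R. The WINC of the paper augments a criterion of this kind with
    a weighting matrix (not reproduced here) so that the columns
    converge to individual principal eigenvectors.
    """
    N = R.shape[0]
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((N, p))
    for _ in range(n_iter):
        RW = R @ W
        # NIC gradient: R W (W^T R W)^{-1} - W
        W = W + eta * (RW @ np.linalg.inv(W.T @ RW) - W)
    return W

# Quick check against a direct eigendecomposition.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 8))
R = A.T @ A / 200                        # 8x8 sample correlation matrix
W = nic_subspace(R, p=3)
U = np.linalg.eigh(R)[1][:, -3:]         # true top-3 eigenvectors
P_w = W @ np.linalg.inv(W.T @ W) @ W.T   # projector onto span(W)
print(np.linalg.norm(P_w - U @ U.T))     # small => subspaces agree
```

Note that without a symmetry-breaking weighting, any rotated orthonormal basis of the principal subspace is an equally good maximizer, which is exactly the degeneracy the WINC's weighting matrix is designed to remove.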

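On the O(Np) point: RLS subspace trackers with this per-sample complexity are well established, and Yang's projection approximation subspace tracking (PAST) recursion is a documented example of how the cost stays linear in N. The sketch below shows that recursion for intuition only; the paper's RLS WINC algorithm is derived from the weighted criterion and differs in its updates, and the forgetting factor and problem dimensions here are arbitrary illustrative choices.

```python
import numpy as np

def past_update(W, P, x, beta=0.99):
    """One sample of Yang's PAST recursion, O(Np) + O(p^2) per step.

    W    : (N, p) current subspace estimate
    P    : (p, p) inverse correlation matrix of the projected data
    x    : (N,) new input sample
    beta : forgetting factor in (0, 1]
    """
    y = W.T @ x                    # project input onto current subspace
    h = P @ y
    g = h / (beta + y @ h)         # RLS gain vector
    P = (P - np.outer(g, h)) / beta
    P = (P + P.T) / 2              # re-symmetrize against round-off
    e = x - W @ y                  # projection residual
    W = W + np.outer(e, g)         # rank-one update of the subspace
    return W, P

# Track the dominant 3-dimensional subspace of a noisy low-rank process.
rng = np.random.default_rng(0)
N, p = 16, 3
mix = rng.standard_normal((N, p))          # fixed low-rank mixing matrix
W = np.linalg.qr(rng.standard_normal((N, p)))[0]
P = np.eye(p)
for _ in range(3000):
    x = mix @ rng.standard_normal(p) + 0.1 * rng.standard_normal(N)
    W, P = past_update(W, P, x)
```

The key to the linear cost is that no N-by-N matrix is ever formed: each update touches W only through matrix-vector products and a rank-one correction.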