A unified convergence analysis of Normalized PAST algorithms for estimating principal and minor components

We present a unified convergence analysis, based on a deterministic discrete-time (DDT) approach, of the normalized projection approximation subspace tracking (Normalized PAST) algorithms for estimating principal and minor components of an input signal. The analysis shows that the DDT system of the Normalized PAST algorithm (for PCA/MCA), with any forgetting factor in a certain range, converges to a desired eigenvector. This eigenvector is completely characterized as the normalized orthogonal projection of the initial estimate onto the eigensubspace corresponding to the largest/smallest eigenvalue of the autocorrelation matrix of the input signal. The characterization holds in the general case where the eigenvalues are not necessarily distinct. Numerical examples show that the proposed analysis accurately captures the convergence behavior of the Normalized PAST algorithms, which use a rank-1 instantaneous approximation of the autocorrelation matrix.
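The characterization above can be illustrated numerically. The sketch below is not the authors' exact Normalized PAST recursion; it is an idealized DDT iteration (assuming the autocorrelation matrix R is known exactly, so the update reduces to a normalized power-method-like step) that exhibits the stated limit: the iterate converges to the normalized orthogonal projection of the initial estimate onto the eigensubspace of the largest eigenvalue, even when that eigenvalue is repeated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Autocorrelation matrix with a REPEATED largest eigenvalue
# (the non-distinct case covered by the analysis).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
eigvals = np.array([3.0, 3.0, 1.0, 0.5])  # largest eigenvalue has multiplicity 2
R = Q @ np.diag(eigvals) @ Q.T

# Random unit-norm initial estimate.
w = rng.standard_normal(4)
w /= np.linalg.norm(w)
w0 = w.copy()

# Idealized DDT iteration: multiply by R, then renormalize.
for _ in range(200):
    w = R @ w
    w /= np.linalg.norm(w)

# Predicted limit: normalized orthogonal projection of w0 onto the
# principal eigensubspace, spanned here by the first two columns of Q.
U = Q[:, :2]
p = U @ (U.T @ w0)
p /= np.linalg.norm(p)

print(np.allclose(w, p, atol=1e-6))  # the iterate matches the predicted limit
```

With all eigenvalues positive, the dominant eigensubspace component of the initial estimate keeps its sign, so the iterate converges to the projection itself rather than its negation; the minor-component (MCA) case would instead single out the eigensubspace of the smallest eigenvalue.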
