We present a unified optimization approach to adaptive estimation of the signal subspace of a covariance matrix. Two optimization criteria for signal subspace estimation, namely mean-square-error (MSE) minimization and output variance maximization, are shown to be equivalent and thus to yield the same algorithm when combined with a certain optimization technique. Our analysis shows that both the subspace algorithm with explicit weight orthonormalization and the unconstrained gradient descent algorithm for minimizing the MSE can be reasonably approximated by the well-known Oja subspace algorithm. The global convergence of Oja's subspace algorithm is established using the Lyapunov function approach. Simulations confirm the analysis.
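For readers unfamiliar with the update rule the abstract refers to, the following is a minimal Python sketch of Oja's subspace algorithm, where a weight matrix W is driven toward an orthonormal basis of the dominant subspace via W ← W + η (x yᵀ − W y yᵀ) with y = Wᵀx. The function name `oja_subspace`, the step size `eta`, and the synthetic data are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def oja_subspace(X, p, eta=0.01, n_epochs=10, seed=0):
    """Track a p-dimensional principal subspace of the data stream X
    (one sample per row) using Oja's subspace rule:
        W <- W + eta * (x y^T - W y y^T),  where y = W^T x.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    # Random orthonormal initial weights.
    W = np.linalg.qr(rng.standard_normal((n, p)))[0]
    for _ in range(n_epochs):
        for x in X:
            y = W.T @ x  # projected output
            W += eta * (np.outer(x, y) - W @ np.outer(y, y))
    return W

# Illustrative usage: recover a dominant 2-D subspace from synthetic data.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2))
X = rng.standard_normal((1000, 2)) @ A.T + 0.1 * rng.standard_normal((1000, 5))
W = oja_subspace(X, p=2)
# At convergence W^T W is close to the 2x2 identity, i.e. the columns of W
# form an approximately orthonormal basis of the signal subspace.
print(np.round(W.T @ W, 3))
```

Note that, consistent with the abstract's point, no explicit orthonormalization step is performed inside the loop; the −W y yᵀ term implicitly keeps the columns of W near orthonormality.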