Fast algorithms for minor component analysis

In this paper, we propose new adaptive algorithms for the extraction and tracking of the least (minor) eigenvectors of a positive Hermitian covariance matrix. The proposed algorithms are said to be fast in the sense that their computational cost is of order O(np) flops per iteration, where n is the size of the observation vector and p < n is the number of minor eigenvectors to be estimated. Two classes of algorithms are considered: the PASTd (projection approximation subspace tracking with deflation) class, derived using projection approximation in conjunction with power iteration, and the Oja class, which uses a stochastic gradient technique. Using appropriate fast orthogonalization techniques, we introduce for each class new fast algorithms that extract the minor eigenvectors and guarantee the orthogonality of the weight matrix at each iteration.
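To illustrate the Oja-class idea in its simplest form, the sketch below runs a stochastic-gradient minor-subspace update on streaming observations, re-orthonormalizing the weight matrix with a plain QR factorization. This is a minimal sketch, not the paper's algorithm: the QR step costs O(np^2) per iteration rather than the O(np) achieved by the fast orthogonalization techniques described above, and all dimensions, eigenvalues, step sizes, and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem setup (not from the paper):
n, p = 6, 2                        # observation size, number of minor eigenvectors
eigvals = np.array([10.0, 8.0, 6.0, 4.0, 0.5, 0.2])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q @ np.diag(eigvals) @ Q.T     # true covariance matrix
A = Q @ np.diag(np.sqrt(eigvals))  # x = A z then has covariance C

# Orthonormal initial weight matrix.
W, _ = np.linalg.qr(rng.standard_normal((n, p)))

T = 20000
for t in range(T):
    x = A @ rng.standard_normal(n)     # new observation vector
    eta = 1.0 / (100.0 + t)            # decreasing step size
    y = W.T @ x                        # p projections: O(np) flops
    W = W - eta * np.outer(x, y)       # gradient *descent* step -> minor components
    W, _ = np.linalg.qr(W)             # re-orthonormalization (O(np^2) here)

# Compare the estimated span with the two true minor eigenvectors.
V = np.linalg.eigh(C)[1][:, :p]        # eigenvectors of the p smallest eigenvalues
err = np.linalg.norm(W @ W.T - V @ V.T)
```

In expectation the update multiplies W by (I - eta*C), so orthogonal iteration amplifies the directions with the smallest eigenvalues; using a gradient ascent step instead would recover the principal components.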