A Class of Robust Principal Component Vectors

This paper is concerned with robust estimation in principal component analysis. A class of robust estimators characterized as eigenvectors of weighted sample covariance matrices is proposed, where the weight functions depend recursively on the eigenvectors themselves. A feasible algorithm based on iterative reweighting of the covariance matrices is also suggested for computing these estimators in practice. Statistical properties of the proposed estimators are investigated in terms of sensitivity to outliers and relative efficiency via their influence functions, which are derived with the help of Stein's lemma. We give a simple condition on the weight functions that ensures robustness of the estimators. The class includes, as a typical example, a method based on the self-organizing rule in neural computation. A numerical experiment is conducted to confirm the rapid convergence of the suggested algorithm.
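To make the iterative reweighting idea concrete, the following is a minimal sketch, not the paper's estimating equations: it assumes an exponential downweighting of the squared residual distance from the current principal subspace, and the function name `reweighted_pca`, the weight choice, and the stopping rule are illustrative assumptions only.

```python
import numpy as np

def reweighted_pca(X, k=1, weight=lambda d: np.exp(-d), n_iter=50, tol=1e-8):
    """Illustrative sketch of iteratively reweighted PCA (assumed weight rule,
    not the specific weight functions studied in the paper).

    X      : (n, p) data matrix, assumed centered
    k      : number of principal component vectors to estimate
    weight : decreasing function of the squared residual distance
             from the current k-dimensional principal subspace
    """
    n, p = X.shape
    # initialize with the ordinary sample covariance eigenvectors
    _, V = np.linalg.eigh(X.T @ X / n)
    V = V[:, -k:]                       # columns = current PC vectors
    for _ in range(n_iter):
        scores = X @ V                  # coordinates in the current subspace
        resid = X - scores @ V.T        # residuals orthogonal to the subspace
        d = np.sum(resid**2, axis=1)    # squared residual distances
        w = weight(d)                   # downweight points far from the subspace
        # weighted sample covariance matrix (weights depend on current V)
        S = (X * w[:, None]).T @ X / w.sum()
        _, V_new = np.linalg.eigh(S)
        V_new = V_new[:, -k:]
        # stop when the spanned subspace stabilizes
        if np.linalg.norm(V_new @ V_new.T - V @ V.T) < tol:
            V = V_new
            break
        V = V_new
    return V
```

A usage example under these assumptions: `V = reweighted_pca(X - X.mean(axis=0), k=2)` returns two orthonormal directions; outlying observations receive small weights in the covariance update, so the recovered directions are less sensitive to them than ordinary PCA.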