We propose a constrained EM algorithm for principal component analysis (PCA), based on a coupled probability model derived from single standard factor analysis models with an isotropic noise structure. Standard probabilistic PCA, especially in the zero-noise case, finds only a set of vectors that are linear superpositions of the principal components, and therefore requires postprocessing such as diagonalization of a symmetric matrix. By contrast, the proposed algorithm finds the actual principal components, sorted in descending order of eigenvalue, and requires no additional calculation or postprocessing. The method extends readily to kernel PCA. We also show that the new EM algorithm can be derived from a generalized least-squares formulation.
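To make the contrast concrete, the following is a minimal sketch (not the paper's constrained algorithm) of the classical zero-noise EM iteration for PCA in the style of Roweis, using synthetic data. It illustrates the limitation the abstract describes: the learned loading matrix W only spans the principal subspace, so its columns are mixtures of the principal components and a postprocessing step (here a QR orthogonalization plus an eigendecomposition for comparison) is needed to recover them. All variable names and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic centered data: 200 samples in 5-D with decaying variances,
# so the leading principal directions are the first coordinate axes.
X = rng.standard_normal((200, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])
X = (X - X.mean(axis=0)).T          # shape (d, n)

d, n = X.shape
k = 2                               # number of components sought
W = rng.standard_normal((d, k))     # random initial loading matrix

for _ in range(200):
    # E-step: latent coordinates under the zero-noise model
    Z = np.linalg.solve(W.T @ W, W.T @ X)
    # M-step: update the loading matrix
    W = X @ Z.T @ np.linalg.inv(Z @ Z.T)

# W spans the principal subspace, but its columns are in general
# linear superpositions of the principal components; postprocessing
# (orthogonalization / diagonalization) is needed to extract them.
Q, _ = np.linalg.qr(W)

# Reference: top-k eigenvectors of the sample covariance.
C = X @ X.T / n
evals, evecs = np.linalg.eigh(C)
U = evecs[:, ::-1][:, :k]

# Subspace alignment: singular values of Q^T U close to 1 mean the
# EM solution spans the same subspace as the true top-k eigenvectors.
s = np.linalg.svd(Q.T @ U, compute_uv=False)
print(s)
```

The constrained algorithm proposed in the abstract removes the need for the final orthogonalization/diagonalization step, delivering the components already sorted by eigenvalue.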