Parametric dimensionality reduction by unsupervised regression

We introduce a parametric version (pDRUR) of the recently proposed Dimensionality Reduction by Unsupervised Regression algorithm. pDRUR alternately minimizes reconstruction error by fitting parametric functions given the latent coordinates and the data, and by updating the latent coordinates given the functions (with a Gauss-Newton method decoupled over coordinates). Both the fit and the update become much faster while attaining results of similar quality, which makes it possible to handle far larger datasets (10^5 points). We show in a number of benchmarks how the algorithm efficiently learns good latent coordinates and bidirectional mappings between the data and latent space, even with very noisy or low-quality initializations, often drastically improving the result of spectral and other methods.
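The alternating scheme described above can be sketched in a few lines. The sketch below is a simplified illustration, not the paper's implementation: it uses linear parametric mappings f (latent to data) and F (data to latent) instead of the richer function classes (e.g. RBF networks) a parametric method would normally use, and a toy 2-D dataset with a PCA initialization. With linear f the per-point Gauss-Newton update for the latent coordinates reduces to one small shared linear system, since the Jacobian of f is constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy 1-D curve embedded in 2-D.
N = 200
t = np.sort(rng.uniform(-1.0, 1.0, N))
Y = np.stack([t, t**2], axis=1) + 0.05 * rng.standard_normal((N, 2))

# Latent initialization, e.g. from a spectral method;
# here simply the first principal component score.
Yc = Y - Y.mean(axis=0)
X = Yc @ np.linalg.svd(Yc, full_matrices=False)[2].T[:, :1]

def fit_linear(A, B, lam=1e-6):
    """Ridge-regularized least-squares fit: B ~ [A, 1] @ W."""
    A1 = np.hstack([A, np.ones((len(A), 1))])
    return np.linalg.solve(A1.T @ A1 + lam * np.eye(A1.shape[1]), A1.T @ B)

def apply_linear(W, A):
    return np.hstack([A, np.ones((len(A), 1))]) @ W

for it in range(50):
    # Step 1: fit both mappings given the current latent coordinates.
    Wf = fit_linear(X, Y)   # f: latent -> data (reconstruction mapping)
    WF = fit_linear(Y, X)   # F: data -> latent (dim.-reduction mapping)

    # Step 2: update the latent coordinates given the mappings.
    # Each x_n minimizes ||y_n - f(x_n)||^2 + ||x_n - F(y_n)||^2;
    # the update decouples over points, and with linear f all points
    # share the same (Jf Jf^T + I) system matrix.
    Jf, bf = Wf[:-1], Wf[-1]            # Jacobian and offset of f
    FX = apply_linear(WF, Y)            # F(y_n) for all points
    A = Jf @ Jf.T + np.eye(X.shape[1])
    B = (Y - bf) @ Jf.T + FX
    X = np.linalg.solve(A, B.T).T

recon_err = np.mean(np.sum((Y - apply_linear(Wf, X))**2, axis=1))
```

Because the mappings here are linear, the sketch can only recover a linear subspace of the parabola; its purpose is to show the structure of the alternation (fit f and F, then a decoupled latent update), which carries over unchanged when f and F are nonlinear parametric models.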
