Preserving Local Structure in Gaussian Process Latent Variable Models

The Gaussian Process Latent Variable Model (GPLVM) is a non-linear variant of probabilistic Principal Component Analysis (PCA). The main advantage of the GPLVM over probabilistic PCA is that it can model non-linear transformations from the latent space to the data space. An important disadvantage of the GPLVM is its focus on preserving global data structure in the latent space, whereas preserving local data structure is generally considered to be more important in dimensionality reduction. In this paper, we present an extension of the GPLVM that encourages the preservation of local structure in the latent space. The extension introduces a prior distribution over the parameters of the GPLVM that measures the divergence between the pairwise distances in the data space and those in the latent space. Experiments show that the proposed extension improves the extent to which local data structure is preserved in the latent space.
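The abstract does not spell out the divergence, so the following is only a rough illustrative sketch, not the paper's exact formulation: one common way to compare pairwise distances across two spaces is to convert them into normalized Gaussian affinity distributions and take a KL divergence between them (the quantity t-SNE-style objectives minimize). The function names and the fixed bandwidth `sigma` below are our own choices for illustration.

```python
import numpy as np

def pairwise_affinities(Z, sigma=1.0):
    """Turn a point set Z of shape (n, d) into a normalized matrix of
    pairwise Gaussian affinities -- a simple stand-in for the
    neighbourhood distributions used by t-SNE-style objectives."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T  # squared distances
    np.fill_diagonal(d2, np.inf)                    # drop self-affinities
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    return P / P.sum()

def structure_divergence(Y, X, sigma=1.0, eps=1e-12):
    """KL-style divergence between data-space (Y) and latent-space (X)
    affinity distributions. Small values mean the latent points
    reproduce the local structure of the data; its negative could serve
    as the log of a structure-preserving prior during GPLVM training."""
    P = pairwise_affinities(Y, sigma)
    Q = pairwise_affinities(X, sigma)
    mask = P > 0
    return float(np.sum(P[mask] * np.log((P[mask] + eps) / (Q[mask] + eps))))
```

If the latent points coincide with the data points the divergence is zero, and it grows as the latent configuration distorts the data's local neighbourhood structure; in a GPLVM this term would be added (negated, as a log-prior) to the marginal likelihood being optimized.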
