A prior-based approximate latent Riemannian metric

Stochastic generative models enable us to capture the geometric structure of a data manifold lying in a high-dimensional space through a Riemannian metric in the latent space. However, the practical use of such a metric is rather limited, mainly due to its inherent computational complexity. In this work we propose a surrogate conformal Riemannian metric in the latent space of a generative model that is simple, efficient, and robust. The metric is based on a prior that we propose to learn with a basic energy-based model. We theoretically analyze the behavior of the proposed metric and show that it is sensible to use in practice. We experimentally demonstrate the efficiency and robustness of the new approximate metric, as well as its behavior, and we show the applicability of the proposed methodology to data analysis in the life sciences.
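To make the idea concrete, below is a minimal NumPy sketch of a conformal latent metric of the assumed form M(z) = alpha(z) * I_d, where the conformal factor alpha(z) is derived from a prior density p(z) proportional to exp(-E(z)). The quadratic energy, the exponent -2/d, and the eps smoothing term are illustrative assumptions standing in for the learned energy-based prior and the paper's exact functional form; this is a sketch of the general technique, not the authors' implementation.

import numpy as np

def energy(z):
    # Placeholder energy: the paper learns E_theta(z) with an
    # energy-based model; a fixed quadratic stands in here purely
    # for illustration.
    return 0.5 * np.sum(np.asarray(z) ** 2, axis=-1)

def conformal_factor(z, d, eps=1e-4):
    # Assumed conformal factor alpha(z): large where the prior density
    # p(z) ~ exp(-E(z)) is low, so curve length penalizes leaving the
    # support of the prior. Exponent and smoothing are assumptions.
    p = np.exp(-energy(z))
    return (p + eps) ** (-2.0 / d)

def curve_length(gamma, n=1000):
    # Riemannian length of a curve gamma: [0, 1] -> R^d under the
    # conformal metric M(z) = alpha(z) * I, discretized with a
    # midpoint rule: L = sum_i sqrt(alpha(midpoint_i)) * ||segment_i||.
    t = np.linspace(0.0, 1.0, n)
    pts = np.stack([gamma(ti) for ti in t])    # (n, d) points on the curve
    segs = np.diff(pts, axis=0)                # (n-1, d) segment vectors
    mids = 0.5 * (pts[1:] + pts[:-1])          # segment midpoints
    alpha = conformal_factor(mids, d=pts.shape[1])
    return float(np.sum(np.sqrt(alpha) * np.linalg.norm(segs, axis=1)))

# A straight line between two latent codes: its Riemannian length grows
# wherever the line crosses regions of low prior density.
z0, z1 = np.array([-2.0, 0.0]), np.array([2.0, 0.0])
line = lambda t: (1.0 - t) * z0 + t * z1
print(curve_length(line))

Minimizing such a length over curves with fixed endpoints yields approximate geodesics that bend toward regions the learned prior deems likely, which is the practical payoff of a conformal metric of this kind.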
