Neural Permutation Processes

We introduce a neural architecture to perform amortized approximate Bayesian inference over latent random permutations of two sets of objects. The method involves approximating permanents of matrices of pairwise probabilities using recent ideas on functions defined over sets. Each sampled permutation comes with a probability estimate, a quantity unavailable in MCMC approaches. We illustrate the method on sets of 2D points and MNIST images.
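To make the role of the permanent concrete, below is a minimal NumPy sketch (not the authors' code) of the chain-rule construction the abstract alludes to: given a matrix P of pairwise probabilities, a permutation sigma can be sampled one assignment at a time, and the product of the conditional probabilities is an exact probability for the sample, P(sigma) = prod_i P[i, sigma(i)] / perm(P). The conditionals require permanents of submatrices; here they are computed by brute force for tiny sets, which is precisely the quantity the paper proposes to approximate with a neural set function. Function names such as `sample_permutation` are ours, chosen for illustration.

```python
import itertools
import numpy as np

def permanent(A):
    """Brute-force permanent of a square matrix (feasible only for tiny N)."""
    n = A.shape[0]
    if n == 0:
        return 1.0
    return sum(np.prod(A[np.arange(n), list(p)])
               for p in itertools.permutations(range(n)))

def sample_permutation(P, rng):
    """Sample sigma with probability prod_i P[i, sigma(i)] / perm(P),
    returning both the permutation and its exact probability."""
    n = P.shape[0]
    remaining = list(range(n))            # columns not yet assigned
    sigma, log_prob = [], 0.0
    for i in range(n):
        # Conditional for sigma(i) = j: P[i, j] times the permanent of the
        # submatrix of unassigned rows and columns (expansion along row i).
        weights = []
        for k, j in enumerate(remaining):
            rest_cols = remaining[:k] + remaining[k + 1:]
            sub = P[np.ix_(range(i + 1, n), rest_cols)]
            weights.append(P[i, j] * permanent(sub))
        weights = np.asarray(weights)
        probs = weights / weights.sum()
        k = rng.choice(len(remaining), p=probs)
        log_prob += np.log(probs[k])
        sigma.append(remaining.pop(k))
    return sigma, np.exp(log_prob)

rng = np.random.default_rng(0)
P = rng.uniform(size=(4, 4))    # pairwise probabilities for two sets of 4 objects
sigma, prob = sample_permutation(P, rng)
print(sigma, prob)              # a sampled matching and its exact probability
```

Replacing the brute-force `permanent` call with a learned, permutation-invariant estimator is what turns this O(N!) procedure into an amortized inference network in the spirit of the abstract.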
