[1] Ivan Sosnovik, et al. PIE: Pseudo-Invertible Encoder, 2018, ArXiv.
[2] Eric Jones, et al. SciPy: Open Source Scientific Tools for Python, 2001.
[3] Abhishek Kumar, et al. Regularized Autoencoders via Relaxed Injective Probability Flow, 2020, AISTATS.
[4] Shakir Mohamed, et al. Variational Inference with Normalizing Flows, 2015, ICML.
[5] Mariusz Bojarski, et al. Invertible Autoencoder for domain adaptation, 2018, Comput.
[6] Tong Che, et al. Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling, 2020, NeurIPS.
[7] E. Lorenz. Deterministic nonperiodic flow, 1963.
[8] Lawrence Cayton, et al. Algorithms for manifold learning, 2005.
[9] Jakob H. Macke, et al. Likelihood-free inference with emulator networks, 2018, AABI.
[10] Sepp Hochreiter, et al. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium, 2017, NIPS.
[11] Gilles Louppe, et al. Likelihood-free inference with an improved cross-entropy estimator, 2018, ArXiv.
[12] Peter Skands, et al. A brief introduction to PYTHIA 8.1, 2007, Comput. Phys. Commun.
[13] Gaël Varoquaux, et al. The NumPy Array: A Structure for Efficient Numerical Computation, 2011, Computing in Science & Engineering.
[14] Eric Nalisnick, et al. Normalizing Flows for Probabilistic Modeling and Inference, 2019, J. Mach. Learn. Res.
[15] Gilles Louppe, et al. The frontier of simulation-based inference, 2020, Proceedings of the National Academy of Sciences.
[16] Iain Murray, et al. Neural Spline Flows, 2019, NeurIPS.
[17] Yoshua Bengio, et al. Generative Adversarial Nets, 2014, NIPS.
[18] Stefano Ermon, et al. Flow-GAN: Combining Maximum Likelihood and Adversarial Learning in Generative Models, 2017, AAAI.
[19] Gurtej Kanwar, et al. Normalizing Flows on Tori and Spheres, 2020, ICML.
[20] Luca Antiga, et al. Automatic differentiation in PyTorch, 2017.
[21] Iain Murray, et al. Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows, 2018, AISTATS.
[22] Yaron Lipman, et al. Implicit Geometric Regularization for Learning Shapes, 2020, ICML.
[23] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[24] Yann LeCun, et al. Backpropagation for Implicit Spectral Densities, 2018, ArXiv.
[25] Prafulla Dhariwal, et al. Glow: Generative Flow with Invertible 1x1 Convolutions, 2018, NeurIPS.
[26] Samy Bengio, et al. Density estimation using Real NVP, 2016, ICLR.
[27] Thomas Kluyver, et al. Jupyter Notebooks - a publishing format for reproducible computational workflows, 2016, ELPUB.
[28] John D. Hunter, et al. Matplotlib: A 2D Graphics Environment, 2007, Computing in Science & Engineering.
[29] Gilles Louppe, et al. Mining gold from implicit models to improve likelihood-free inference, 2018, Proceedings of the National Academy of Sciences.
[30] Alain Trouvé, et al. Interpolating between Optimal Transport and MMD using Sinkhorn Divergences, 2018, AISTATS.
[31] Gurtej Kanwar, et al. Equivariant flow-based sampling for lattice gauge theory, 2020, Physical Review Letters.
[32] Yoshua Bengio, et al. NICE: Non-linear Independent Components Estimation, 2014, ICLR.
[33] Gilles Louppe, et al. Approximating Likelihood Ratios with Calibrated Discriminative Classifiers, 2015, arXiv:1506.02169.
[34] Bernhard Schölkopf, et al. From Variational to Deterministic Autoencoders, 2019, ICLR.
[35] J. Favereau, et al. DELPHES 3: a modular framework for fast simulation of a generic collider experiment, 2013, Journal of High Energy Physics.
[36] Timo Aila, et al. A Style-Based Generator Architecture for Generative Adversarial Networks, 2019, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[37] Mario Lucic, et al. Are GANs Created Equal? A Large-Scale Study, 2017, NeurIPS.
[38] Renjie Liao, et al. Latent Variable Modelling with Hyperbolic Normalizing Flows, 2020, ICML.
[39] Arthur Gretton, et al. KALE: When Energy-Based Learning Meets Adversarial Training, 2020, ArXiv.
[40] Max Welling, et al. Auto-Encoding Variational Bayes, 2013, ICLR.
[41] David Duvenaud, et al. Residual Flows for Invertible Generative Modeling, 2019, NeurIPS.
[42] Divakar Viswanath, et al. The fractal property of the Lorenz attractor, 2004.
[43] Jaakko Lehtinen, et al. Analyzing and Improving the Image Quality of StyleGAN, 2020, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[44] Fu Jie Huang, et al. A Tutorial on Energy-Based Learning, 2006.
[45] S. Liberty, et al. Linear Systems, 2010, Scientific Parallel Computing.
[46] K. Cranmer, et al. MadMiner: Machine Learning-Based Inference for Particle Physics, 2019, Computing and Software for Big Science.
[47] R. Frederix, et al. The automated computation of tree-level and next-to-leading order differential cross sections, and their matching to parton shower simulations, 2014, arXiv:1405.0301.
[48] Bernhard Schölkopf, et al. A Kernel Two-Sample Test, 2012, J. Mach. Learn. Res.