End-to-End Sinkhorn Autoencoder With Noise Generator
Kamil Deja | Jan Dubiński | Piotr Nowak | Sandro Wenzel | Przemysław Spurek | Tomasz Trzciński