Compositional Visual Generation with Energy Based Models
Yilun Du | Shuang Li | Igor Mordatch