Jascha Sohl-Dickstein | Samuel S. Schoenholz | Jeffrey Pennington | Lechao Xiao | Yasaman Bahri