暂无分享,去创建一个
[1] Ohad Shamir,et al. The Power of Depth for Feedforward Neural Networks , 2015, COLT.
[2] Kurt Hornik,et al. Approximation capabilities of multilayer feedforward networks , 1991, Neural Networks.
[3] Yoshua Bengio,et al. Shallow vs. Deep Sum-Product Networks , 2011, NIPS.
[4] Matus Telgarsky,et al. Benefits of Depth in Neural Networks , 2016, COLT.
[5] Nico M. Temme,et al. Numerical methods for special functions , 2007 .
[6] Yoshua Bengio,et al. Maxout Networks , 2013, ICML.
[7] Yann LeCun,et al. Regularization of Neural Networks using DropConnect , 2013, ICML.
[8] George Cybenko,et al. Approximation by superpositions of a sigmoidal function , 1989, Math. Control. Signals Syst..
[9] C. Chui,et al. Approximation by ridge functions and neural networks with one hidden layer , 1992 .
[10] Andrew R. Barron,et al. Universal approximation bounds for superpositions of a sigmoidal function , 1993, IEEE Trans. Inf. Theory.
[11] Kurt Hornik,et al. Multilayer feedforward networks are universal approximators , 1989, Neural Networks.
[12] Yoshua. Bengio,et al. Learning Deep Architectures for AI , 2007, Found. Trends Mach. Learn..
[13] Alexandr Andoni,et al. Learning Polynomials with Neural Networks , 2014, ICML.
[14] Geoffrey E. Hinton,et al. ImageNet classification with deep convolutional neural networks , 2012, Commun. ACM.