A novel activation function for multilayer feed-forward neural networks
[1] Andrew L. Maas. Rectifier Nonlinearities Improve Neural Network Acoustic Models, 2013.
[2] Jürgen Schmidhuber, et al. Compete to Compute, 2013, NIPS.
[3] Razvan Pascanu, et al. Theano: new features and speed improvements, 2012, ArXiv.
[4] Andrew Zisserman, et al. Return of the Devil in the Details: Delving Deep into Convolutional Nets, 2014, BMVC.
[5] Benjamin Graham, et al. Spatially-sparse convolutional neural networks, 2014, ArXiv.
[6] Patrice Y. Simard, et al. Best practices for convolutional neural networks applied to visual document analysis, 2003, Seventh International Conference on Document Analysis and Recognition, 2003. Proceedings.
[7] Simon Haykin, et al. Gradient-Based Learning Applied to Document Recognition, 2001.
[8] Yann LeCun, et al. What is the best multi-stage architecture for object recognition?, 2009, 2009 IEEE 12th International Conference on Computer Vision.
[9] D. Mandic, et al. Complex Valued Nonlinear Adaptive Filters: Noncircularity, Widely Linear and Neural Models, 2009.
[10] Yoshua Bengio, et al. Maxout Networks, 2013, ICML.
[11] A. V. Olgac, et al. Performance Analysis of Various Activation Functions in Generalized MLP Architectures of Neural Networks, 2011.
[12] David G. Stork, et al. Pattern Classification, 1973.
[13] Klaus-Robert Müller, et al. Efficient BackProp, 2012, Neural Networks: Tricks of the Trade.
[14] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[15] Alex Krizhevsky, et al. Learning Multiple Layers of Features from Tiny Images, 2009.
[16] Geoffrey E. Hinton, et al. Speech recognition with deep recurrent neural networks, 2013, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing.
[17] Cordelia Schmid, et al. Convolutional Kernel Networks, 2014, NIPS.
[18] James Martens, et al. Deep learning via Hessian-free optimization, 2010, ICML.
[19] Geoffrey E. Hinton, et al. ImageNet classification with deep convolutional neural networks, 2012, Commun. ACM.
[20] Yoshua Bengio, et al. Gradient-based learning applied to document recognition, 1998, Proc. IEEE.
[21] Zhuowen Tu, et al. Deeply-Supervised Nets, 2014, AISTATS.
[22] Yoon Kim, et al. Convolutional Neural Networks for Sentence Classification, 2014, EMNLP.
[23] Yoshua Bengio, et al. Deep Sparse Rectifier Neural Networks, 2011, AISTATS.
[24] Rob Fergus, et al. Stochastic Pooling for Regularization of Deep Convolutional Neural Networks, 2013, ICLR.
[25] Danilo P. Mandic, et al. Complex Valued Nonlinear Adaptive Filters, 2009.
[26] Andrew Y. Ng, et al. Reading Digits in Natural Images with Unsupervised Feature Learning, 2011.
[27] David G. Stork, et al. Pattern Classification (2nd ed.), 1999.
[28] Razvan Pascanu, et al. Theano: A CPU and GPU Math Compiler in Python, 2010, SciPy.
[29] Nitish Srivastava, et al. Improving neural networks by preventing co-adaptation of feature detectors, 2012, ArXiv.
[30] Bo Pang, et al. Seeing Stars: Exploiting Class Relationships for Sentiment Categorization with Respect to Rating Scales, 2005, ACL.
[31] Nitish Srivastava, et al. Improving Neural Networks with Dropout, 2013.