Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks
Leon René Sütfeld | Flemming Brieger | Holger Finger | Sonja Füllhase | Gordon Pipa