[1] Peter M. Roth, et al. The Quest for the Golden Activation Function, 2018, ArXiv.
[2] Sergey Ioffe, et al. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, 2015, ICML.
[3] Stephen Marshall, et al. Activation Functions: Comparison of trends in Practice and Research for Deep Learning, 2018, ArXiv.
[4] Quoc V. Le, et al. Evolving Normalization-Activation Layers, 2020, NeurIPS.
[5] Sepp Hochreiter, et al. Self-Normalizing Neural Networks, 2017, NIPS.
[6] Kevin Gimpel, et al. Gaussian Error Linear Units (GELUs), 2016, ArXiv.
[7] Nitish Srivastava, et al. Improving neural networks by preventing co-adaptation of feature detectors, 2012, ArXiv.
[8] Nikos Komodakis, et al. Wide Residual Networks, 2016, BMVC.
[9] Douglas Thain, et al. Distributed computing in practice: the Condor experience, 2005, Concurr. Pract. Exp.
[10] Risto Miikkulainen, et al. Evolutionary optimization of deep learning activation functions, 2020, GECCO.
[11] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, CVPR.
[12] Diganta Misra. Mish: A Self Regularized Non-Monotonic Activation Function, 2020, BMVC.
[13] Risto Miikkulainen, et al. Evolving Loss Functions with Multivariate Taylor Polynomial Parameterizations, 2020, ArXiv.
[14] Frank Hutter, et al. Neural Architecture Search: A Survey, 2018, J. Mach. Learn. Res.
[15] Sepp Hochreiter, et al. Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), 2015, ICLR.
[16] Risto Miikkulainen, et al. Population-Based Training for Loss Function Optimization, 2020, ArXiv.
[17] Jian Sun, et al. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification, 2015, ICCV.
[18] Quoc V. Le, et al. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks, 2019, ICML.
[19] Martin Wistuba, et al. A Survey on Neural Architecture Search, 2019, ArXiv.
[20] Thomas Brox, et al. Striving for Simplicity: The All Convolutional Net, 2014, ICLR.
[22] Yuan Yu, et al. TensorFlow: A system for large-scale machine learning, 2016, OSDI.
[23] Andrew L. Maas. Rectifier Nonlinearities Improve Neural Network Acoustic Models, 2013.
[24] Geoffrey E. Hinton, et al. Rectified Linear Units Improve Restricted Boltzmann Machines, 2010, ICML.
[25] Jian Sun, et al. Identity Mappings in Deep Residual Networks, 2016, ECCV.
[26] Risto Miikkulainen, et al. Active Guidance for a Finless Rocket Using Neuroevolution, 2003, GECCO.
[27] Quoc V. Le, et al. Searching for Activation Functions, 2018, ArXiv.
[28] Alok Aggarwal, et al. Regularized Evolution for Image Classifier Architecture Search, 2018, AAAI.
[29] Risto Miikkulainen, et al. Improved Training Speed, Accuracy, and Data Utilization Through Loss Function Optimization, 2019, CEC.
[30] L. Darrell Whitley, et al. Delta Coding: An Iterative Search Strategy for Genetic Algorithms, 1991, ICGA.
[31] Kenji Doya, et al. Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning, 2017, Neural Networks.
[32] Li Fei-Fei, et al. ImageNet: A large-scale hierarchical image database, 2009, CVPR.
[33] Diganta Misra, et al. Mish: A Self Regularized Non-Monotonic Neural Activation Function, 2019, ArXiv.
[34] John R. Koza, et al. Genetic programming - on the programming of computers by means of natural selection, 1993, Complex adaptive systems.
[35] Alex Krizhevsky, et al. Learning Multiple Layers of Features from Tiny Images, 2009.