An analysis of the impact of subsampling on the neural network error surface
Cody Dennis | Beatrice M. Ombuki-Berman | Andries Engelbrecht