A Comparison of Architectural Constraints for Feedforward Neural Diversity Machines
[1] Yee Whye Teh, et al. A Fast Learning Algorithm for Deep Belief Nets, 2006, Neural Computation.
[2] Juan Julián Merelo Guervós, et al. G-Prop-III: Global Optimization of Multilayer Perceptrons using an Evolutionary Algorithm, 1999, GECCO.
[3] Stefan Wermter, et al. Hybrid neural systems: from simple coupling to fully integrated neural networks, 1999.
[4] Nathan Intrator, et al. A Hybrid Projection Based and Radial Basis Function Architecture, 2000, Multiple Classifier Systems.
[5] Amit Konar, et al. Two improved differential evolution schemes for faster global search, 2005, GECCO '05.
[6] Wlodzislaw Duch, et al. Optimal transfer function neural networks, 2001, ESANN.
[7] G. Mirchandani, et al. On hidden nodes for neural nets, 1989.
[8] Yoshua Bengio, et al. Learning Deep Architectures for AI, 2007, Found. Trends Mach. Learn.
[9] Rainer Storn, et al. Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces, 1997, J. Glob. Optim.
[10] Pedro Antonio Gutiérrez, et al. Hybrid Artificial Neural Networks: Models, Algorithms and Data, 2011, IWANN.
[11] Tomás Maul, et al. Early experiments with neural diversity machines, 2013, Neurocomputing.