Training neural networks by stochastic optimisation
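The title refers to gradient-free, stochastic training schemes such as random search, weight perturbation, and simulated annealing, which several of the references below develop. As a purely illustrative sketch (not the paper's own algorithm), the following Python snippet trains a tiny network on XOR by random weight perturbation with greedy acceptance; the 2-4-1 architecture, the step size `sigma`, and the iteration count are arbitrary choices made for this example.

```python
import numpy as np

# Illustrative sketch only: train a small 2-4-1 network on XOR by random
# weight perturbation, keeping a perturbation only if it lowers the error.
# Architecture, sigma, and the iteration budget are arbitrary example values.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def forward(params, X):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def loss(params):
    return np.mean((forward(params, X) - y) ** 2)

# Random initial weights for a 2-4-1 network.
params = [rng.normal(0, 0.5, (2, 4)), np.zeros(4),
          rng.normal(0, 0.5, (4, 1)), np.zeros(1)]

best = loss(params)
sigma = 0.1                                     # perturbation step size
for step in range(20000):
    trial = [p + rng.normal(0, sigma, p.shape) for p in params]
    trial_loss = loss(trial)
    if trial_loss < best:                       # greedy acceptance rule
        params, best = trial, trial_loss

print("final squared error:", best)
```

A simulated-annealing variant of the same loop would also accept some worsening perturbations with a temperature-dependent probability, in the spirit of the Metropolis et al. and Kirkpatrick/Gelatt entries in the reference list below.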
[1] Geoffrey E. Hinton, et al. A Learning Algorithm for Boltzmann Machines, 1985, Cogn. Sci.
[2] Roberto Brunelli, et al. Training neural nets through stochastic minimization, 1994, Neural Networks.
[3] Eric Backer, et al. Finding point correspondences using simulated annealing, 1995, Pattern Recognit.
[4] Werner Ebeling, et al. Optimization of NP-Complete Problems by Boltzmann-Darwin Strategies Including Life Cycles, 1988.
[5] Marwan A. Jabri, et al. Weight perturbation: an optimal architecture and learning technique for analog VLSI feedforward and recurrent multilayer networks, 1992, IEEE Trans. Neural Networks.
[6] Klaus-Robert Müller, et al. Statistical Theory of Overtraining - Is Cross-Validation Asymptotically Effective?, 1995, NIPS.
[7] N. Baba. Convergence of a random optimization method for constrained optimization problems, 1981.
[8] K. P. Unnikrishnan, et al. Alopex: A Correlation-Based Learning Algorithm for Feedforward and Recurrent Neural Networks, 1994, Neural Computation.
[9] Hyun Seung Yang, et al. Robust image segmentation using genetic algorithm with a fuzzy measure, 1996, Pattern Recognit.
[10] John H. Holland, et al. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, 1992.
[11] Stephen T. Barnard, et al. A Stochastic Approach to Stereo Vision, 1986, AAAI.
[12] Thomas Kailath, et al. Model-free distributed learning, 1990, IEEE Trans. Neural Networks.
[13] R. L. Anderson. Recent Advances in Finding Best Operating Conditions, 1953.
[14] Kenneth A. De Jong, et al. Are Genetic Algorithms Function Optimizers?, 1992, PPSN.
[15] P. Carnevali, et al. Image processing by simulated annealing, 1985.
[16] Roger J.-B. Wets, et al. Minimization by Random Search Techniques, 1981, Math. Oper. Res.
[17] Lane A. Hemaspaandra, et al. Using simulated annealing to design good codes, 1987, IEEE Trans. Inf. Theory.
[18] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.
[19] Antanas Verikas, et al. Colour image segmentation by modular neural network, 1997, Pattern Recognit. Lett.
[20] C. D. Gelatt, et al. Optimization by Simulated Annealing, 1983, Science.
[21] Edward J. Delp, et al. A Cost Minimization Approach to Edge Detection Using Simulated Annealing, 1992, IEEE Trans. Pattern Anal. Mach. Intell.
[22] L. Darrell Whitley, et al. Genetic algorithms and neural networks: optimizing connections and connectivity, 1990, Parallel Comput.
[23] N. Metropolis, et al. Equation of State Calculations by Fast Computing Machines, 1953, J. Chem. Phys.
[24] Marc Parizeau, et al. Optimizing the cost matrix for approximate string matching using genetic algorithms, 1998, Pattern Recognit.
[25] Günther F. Schrack, et al. Optimized relative step size random searches, 1976, Math. Program.
[26] Norio Baba. A new approach for finding the global minimum of error function of neural networks, 1989, Neural Networks.
[27] Yoh-Han Pao, et al. Combinatorial optimization with use of guided evolutionary simulated annealing, 1995, IEEE Trans. Neural Networks.
[28] Paolo Carnevali, et al. Image Processing by Simulated Annealing, 1985, IBM J. Res. Dev.
[29] Yoshio Mogami, et al. A hybrid algorithm for finding the global minimum of error function of neural networks and its applications, 1994, Neural Networks.
[30] W. M. Jenkins. Genetic Algorithms and Neural Networks, 1999, Neural Networks in the Analysis and Design of Structures.
[31] Ron Meir, et al. A Parallel Gradient Descent Method for Learning in Analog VLSI Neural Networks, 1992, NIPS.