Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks
[1] Joel W. Burdick, et al. Finding antipodal point grasps on irregularly shaped objects, 1992, IEEE Trans. Robotics Autom.
[2] Bedri C. Cetin, et al. Terminal repeller unconstrained subenergy tunneling (TRUST) for fast global optimization, 1993.
[3] N. E. Cotter, et al. A diffusion process for global optimization in neural networks, 1991, IJCNN-91-Seattle International Joint Conference on Neural Networks.
[4] E. K. Blum, et al. Approximation of Boolean Functions by Sigmoidal Networks: Part I: XOR and Other Two-Variable Functions, 1989, Neural Computation.
[5] Joel W. Burdick, et al. Efficient global redundant configuration resolution via sub-energy tunneling and terminal repelling, 1991, Proceedings of the 1991 IEEE International Conference on Robotics and Automation.
[6] Amir Atiya. Learning algorithms for neural networks, 1991.
[7] Michail Zak, et al. Terminal attractors in neural networks, 1989, Neural Networks.
[8] Geoffrey E. Hinton, et al. Learning and relearning in Boltzmann machines, 1986.