Associating arbitrary-order energy functions to an artificial neural network: Implications concerning the resolution of optimization problems
We study the restrictions that a first-order asynchronous feedback neural network must satisfy in order to be associated with an arbitrary-order energy function of the kind described by Kobuchi [6], i.e., so that the network's evolution corresponds to descent toward a minimum of such a function. These restrictions do not prevent associating even-order energy functions with a first-order network. For odd-order energy functions, however, most of the weights of each neuron must be zero. This result rules out the use of first-order neural networks for solving optimization problems associated with an odd-order function, thereby justifying the use of high-order neural networks. For the latter, we obtain a general expression of their possible energy functions, which includes, as a special case, the high-order generalization of Hopfield's energy functions used to date, e.g., in [5], [8].
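The distinction between even- and odd-order terms can be made concrete with a small sketch. The following Python snippet (an illustration, not code from the paper; the function name and the dictionary-based weight encoding are assumptions) evaluates a Hopfield-style energy with terms of arbitrary order, where a weight indexed by a tuple of length k contributes an order-k product of neuron states. Note that flipping every state s_i → -s_i leaves even-order terms unchanged but negates odd-order ones, which is the asymmetry at the heart of the abstract's result.

```python
from itertools import product
import math

def high_order_energy(weights, state):
    """Hopfield-style energy with arbitrary-order terms.

    weights: dict mapping an index tuple (i, j, ...) to a real
             coefficient; a tuple of length k yields an order-k term.
    state:   sequence of neuron states s_i in {-1, +1}.
    Returns  E(s) = -sum_over_tuples w_tuple * prod_i s_i.
    """
    return -sum(w * math.prod(state[i] for i in idx)
                for idx, w in weights.items())

# A single third-order (odd) term: E = -w_012 * s0 * s1 * s2
w = {(0, 1, 2): 1.0}
print(high_order_energy(w, [1, 1, 1]))     # -1.0
print(high_order_energy(w, [-1, -1, -1]))  # 1.0: odd order flips sign

# A second-order (even) term is invariant under a global sign flip
w2 = {(0, 1): 1.0}
print(high_order_energy(w2, [1, 1]))       # -1.0
print(high_order_energy(w2, [-1, -1]))     # -1.0
```

The global sign-flip behavior shown in the comments is the reason an odd-order energy cannot, in general, be realized by a first-order network without most weights vanishing.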
[1] Gonzalo Joya, et al. Application of high-order Hopfield neural networks to the solution of Diophantine equations, 1991.
[2] Y. Kobuchi, et al. Quasi-symmetric logic networks have Lyapunov functions, 1991, [Proceedings] 1991 IEEE International Joint Conference on Neural Networks.
[3] Tariq Samad, et al. High-order Hopfield and Tank optimization networks, 1990, Parallel Comput.
[4] J. Hopfield. Neurons with graded response have collective computational properties like those of two-state neurons, 1984.