Stability Analysis of Higher-Order Neural Networks for Combinatorial Optimization

Recurrent neural networks with higher-order connections, hereafter referred to as higher-order neural networks (HONNs), can be used to solve combinatorial optimization problems. In Ref. 5 a mapping of the traveling salesman problem (TSP) onto a HONN of arbitrary order was developed, creating a family of related networks that can be used to solve the TSP. In this paper, we explore the trade-off between network complexity and solution quality made available by the HONN mapping of the TSP. The trade-off is investigated through an analysis of the stability of valid TSP solutions in a HONN of arbitrary order. The techniques used to perform the stability analysis are not new and have been widely used elsewhere in the literature; the original contribution of this paper is their application to a HONN of arbitrary order used to solve the TSP. The results of the stability analysis show that solution quality improves as network complexity, measured by the order of the network, increases. Furthermore, it is shown that the Hopfield network, as the simplest network in the family of higher-order networks, is expected to produce the poorest solutions.
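To make the setting concrete, the simplest member of the family discussed above is the quadratic Hopfield network with the classic Hopfield–Tank energy for the TSP, whose local minima are the stable states the stability analysis examines. Below is a minimal sketch of that energy function; the function name, the penalty coefficients `A`, `B`, `C`, `d_coef`, and the specific penalty grouping are illustrative assumptions (one common formulation), not the paper's own HONN mapping, which generalizes this quadratic case to arbitrary order.

```python
import numpy as np

def hopfield_tsp_energy(V, D, A=1.0, B=1.0, C=1.0, d_coef=1.0):
    """Hopfield-Tank style quadratic energy for an n-city TSP (illustrative).

    V : (n, n) binary matrix, V[x, i] = 1 if city x occupies tour position i.
    D : (n, n) symmetric distance matrix.
    The A and B terms penalize a city in several positions / a position
    holding several cities, the C term enforces exactly n active neurons,
    and the final term measures tour length (positions taken modulo n).
    """
    n = V.shape[0]
    # A-term: same city appearing at two different tour positions.
    row_dup = sum(V[x, i] * V[x, j]
                  for x in range(n) for i in range(n) for j in range(n) if i != j)
    # B-term: two different cities sharing the same tour position.
    col_dup = sum(V[x, i] * V[y, i]
                  for i in range(n) for x in range(n) for y in range(n) if x != y)
    # C-term: total activation must equal the number of cities.
    count = (V.sum() - n) ** 2
    # Distance term: each city interacts with its tour neighbors.
    length = sum(D[x, y] * V[x, i] * (V[y, (i + 1) % n] + V[y, (i - 1) % n])
                 for x in range(n) for y in range(n) if x != y for i in range(n))
    return (A / 2) * row_dup + (B / 2) * col_dup + (C / 2) * count + (d_coef / 2) * length
```

For a valid permutation matrix the three constraint terms vanish and the energy reduces to the tour length, so valid tours of different lengths sit at different energy levels; an invalid assignment picks up a positive penalty. The paper's point is that in this quadratic network some invalid or poor states can still be stable, and raising the order of the network improves which states remain stable.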

[1] Satoshi Matsuda. Stability of solutions in Hopfield neural network, 1995, Systems and Computers in Japan.

[2] J. J. Hopfield, et al. Neurons with graded response have collective computational properties like those of two-state neurons, 1984, Proceedings of the National Academy of Sciences of the United States of America.

[3] C. Lee Giles, et al. Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks, 1992, Neural Computation.

[4] Yoshinori Uesaka. Mathematical Aspects of Neuro-Dynamics for Combinatorial Optimization, 1991.

[5] C. L. Giles, et al. Machine learning using higher order correlation networks, 1986.

[6] Isabelle Guyon, et al. High-order neural networks: information storage without errors, 1987.

[7] Abbott. Storage capacity of generalized networks, 1987, Physical Review A, General Physics.

[8] Andrew Howard Gee. Problem solving with optimization networks, 1993.

[9] Baldi. Number of stable points for spin-glasses and neural networks of higher orders, 1987, Physical Review Letters.

[10] Satoshi Matsuda. "Optimal" neural representation of higher order for quadratic combinatorial optimization, 1999, Proceedings of IJCNN'99, International Joint Conference on Neural Networks.

[11] J. J. Hopfield, et al. "Neural" computation of decisions in optimization problems, 1985, Biological Cybernetics.

[12] Satoshi Matsuda, et al. "Optimal" Hopfield network for combinatorial optimization with linear cost function, 1998, IEEE Trans. Neural Networks.

[13] Andrew H. Gee, et al. Polyhedral Combinatorics and Neural Networks, 1994, Neural Computation.

[14] Jörgen M. Karlholm. Associative memories with short-range, higher order couplings, 1993, Neural Networks.

[15] Mahesan Niranjan, et al. A theoretical investigation into the performance of the Hopfield model, 1990, IEEE Trans. Neural Networks.

[16] A. Dembo, et al. High-order absolutely stable neural networks, 1991.

[17] Gerhard Reinelt. TSPLIB - A Traveling Salesman Problem Library, 1991, INFORMS J. Comput.

[18] Raymond L. Watrous, et al. Induction of Finite-State Languages Using Second-Order Recurrent Networks, 1992, Neural Computation.

[19] Pierre Baldi, et al. Neural networks, orientations of the hypercube, and algebraic threshold functions, 1988, IEEE Trans. Inf. Theory.

[20] E. Gardner. Multiconnected neural network models, 1987.