A neural dynamics model for structural optimization—Theory

Abstract A neural dynamics model is presented for the optimal design of structures. A Lyapunov function is used to develop the neural dynamics structural optimization model and to prove its stability. An exterior penalty function method is adopted to formulate an objective function for the general constrained structural optimization problem in the form of a Lyapunov function. A learning rule is developed by integrating the Kuhn-Tucker necessary conditions for a local minimum with the formulated Lyapunov function. The topology of the neural dynamics model consists of two distinct layers: a variable layer and a constraint layer. The numbers of nodes in the variable and constraint layers correspond to the numbers of design variables and constraints in the structural optimization problem, respectively. Both excitatory and inhibitory connections are used to adjust the states of the nodes. In addition to the commonly used inter-layer connections, recurrent connections are used to represent the gradient information of the objective function. In a companion paper the neural dynamics model is applied to the optimum plastic design of steel structures.
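The penalty-plus-dynamics scheme the abstract describes can be sketched in a few lines. The following is a minimal illustration only, not the paper's model: the toy objective, the single linear constraint, the penalty coefficient `r`, and the Euler step size are all assumptions chosen for demonstration. The exterior penalty converts the constrained problem into an unconstrained pseudo-objective that plays the role of a Lyapunov function, and the design variables evolve along its negative gradient until the gradient (the Kuhn-Tucker stationarity condition) approximately vanishes.

```python
import numpy as np

# Toy problem (assumed for illustration):
#   minimize f(x) = x1^2 + x2^2   subject to  g(x) = 1 - x1 - x2 <= 0
# The exact constrained minimum is x1 = x2 = 0.5 with f = 0.5.

def f(x):
    return x[0] ** 2 + x[1] ** 2

def grad_f(x):
    return 2.0 * x

def g(x):
    return 1.0 - x[0] - x[1]

def grad_g(x):
    return np.array([-1.0, -1.0])

def pseudo_objective_grad(x, r):
    # Exterior penalty pseudo-objective: P(x) = f(x) + r * max(0, g(x))^2.
    # Only violated constraints contribute, so feasible points are unpenalized.
    viol = max(0.0, g(x))
    return grad_f(x) + 2.0 * r * viol * grad_g(x)

def neural_dynamics_minimize(x0, r=100.0, dt=1e-3, steps=20000):
    # Forward-Euler integration of the gradient flow dx/dt = -grad P(x).
    # P is non-increasing along the trajectory, which is what makes it a
    # Lyapunov function for these dynamics.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x -= dt * pseudo_objective_grad(x, r)
    return x

x_star = neural_dynamics_minimize([2.0, 0.0])
```

With a finite penalty coefficient the minimizer of the pseudo-objective sits slightly inside the infeasible region (here at x1 = x2 = 100/201 ≈ 0.4975 rather than exactly 0.5); driving `r` upward in an outer loop, as exterior penalty methods do, pushes the iterate toward the true constrained optimum.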
