A Formal Model for Definition and Simulation of Generic Neural Networks

This paper presents a formal data structure for characterizing any neural paradigm, without restriction, including higher-order networks. Within this model, a neural network is described mathematically by a set of static parameters (number of neurons, order) together with a set of statistical distributions, which we call the network 'dynamics'. Once a concrete set of distributions is defined, a single algorithm can simulate any neural paradigm. The structure supports an exhaustive and precise description of both the network characteristics and the simulation parameters, providing a unified criterion for comparing models and evaluating proposed systems. Although not presented here, the formal model has inspired a software simulator that implements any system defined according to this structure, thus facilitating the analysis and modelling of neural paradigms.
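The separation the abstract describes, static parameters on one side and distribution-valued 'dynamics' on the other, feeding a single generic update algorithm, can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual structure; all names, the restriction to pairwise weights, and the threshold activation are assumptions made for brevity:

```python
import random
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class GenericNetwork:
    num_neurons: int                      # static parameter
    order: int                            # static parameter: interaction order
    weight_dist: Callable[[], float]      # 'dynamics': sampling distribution for weights
    activation: Callable[[float], float]  # 'dynamics': neuron transfer function

    def __post_init__(self):
        # A network of order k would carry one weight per k-tuple of neurons;
        # for brevity this sketch keeps only pairwise (first-order) weights.
        self.weights = [[self.weight_dist() for _ in range(self.num_neurons)]
                        for _ in range(self.num_neurons)]

    def step(self, state: List[float]) -> List[float]:
        # The single generic update rule: weighted sum, then activation.
        return [self.activation(sum(w * s for w, s in zip(row, state)))
                for row in self.weights]

random.seed(0)
net = GenericNetwork(num_neurons=3, order=1,
                     weight_dist=lambda: random.uniform(-1.0, 1.0),
                     activation=lambda x: 1.0 if x >= 0 else -1.0)
out = net.step([1.0, -1.0, 1.0])
```

With sign activations and symmetric weights this specializes to a Hopfield-style network; swapping the distributions and transfer function, rather than the algorithm, is what selects the paradigm.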
