Relaxation properties and learning paradigms in complex systems

In addition to the three paradigms of neural networks generally studied (convergent, oscillatory, chaotic), a fourth is proposed that, in a general sense, contains the other three as particular cases. It is defined as a nonstationary model of a spin-glass-like neural net, with dynamics on both the spins and the weights, so that the net can continuously redefine its phase space on a purely dynamical basis. The system therefore displays different behaviors (noisy, chaotic, stable) as a function of a finite temporal order parameter, i.e., of a finite correlation among the spins that drives the weight dynamics. A first analysis of this model, which makes the probability distribution over the spins nonstationary, is developed by comparison with several paradigms of relaxation neural nets formulated in the classical framework of statistical mechanics. The nonstationary, analytically unpredictable, yet deterministic and hence computable behavior of such a model makes a neural net suited to recognition tasks on nonsteady inputs and to semantic problems.
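The abstract does not give explicit update equations, so the following is only a minimal sketch of the kind of coupled spin/weight dynamics it describes, assuming Glauber updates for the spins and a Hebbian-like relaxation of the couplings toward a finite-time spin correlation (standing in for the finite temporal order parameter). All parameter names (`N`, `T`, `eta`, `window`) and the specific update rules are illustrative assumptions, not the model of the paper.

```python
import numpy as np

# Minimal sketch (not the authors' equations): a spin-glass-like net in which
# both the spin configuration and the couplings evolve, the weights being
# driven by a correlation among the spins computed over a finite time window.

rng = np.random.default_rng(0)

N = 64            # number of spins (illustrative size)
T = 0.8           # temperature for Glauber spin updates (assumed)
eta = 0.01        # rate of the weight dynamics (assumed)
window = 20       # length of the finite correlation window (assumed)

s = rng.choice([-1.0, 1.0], size=N)                 # Ising spins
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # spin-glass-like couplings
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

history = []  # recent spin configurations, used for the finite-time correlation

for step in range(2000):
    # --- spin dynamics: asynchronous Glauber updates at temperature T ---
    for i in rng.permutation(N):
        h = J[i] @ s                                  # local field on spin i
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))     # P(s_i = +1 | local field)
        s[i] = 1.0 if rng.random() < p_up else -1.0

    # --- weight dynamics: driven by the correlation over the finite window ---
    history.append(s.copy())
    if len(history) > window:
        history.pop(0)
    S = np.array(history)                             # shape (window, N)
    C = S.T @ S / len(history)                        # finite-time correlation <s_i s_j>
    np.fill_diagonal(C, 0.0)
    J += eta * (C - J)                                # relax couplings toward the correlation

    if step % 500 == 0:
        q = np.mean(C[np.triu_indices(N, k=1)] ** 2)  # Edwards-Anderson-like overlap measure
        print(f"step {step:4d}  mean squared correlation = {q:.4f}")
```

Because the couplings chase a correlation computed over a finite window rather than a stationary average, the distribution over spin configurations keeps being redefined by the dynamics itself, which is the qualitative point the abstract makes; the window length is the knob that plays the role of the finite temporal order parameter in this sketch.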
