Principles of computational dynamics: applications to parallel and neural computations

In this paper, starting from a general discussion of neural network dynamics from the standpoint of statistical mechanics, we discuss three different strategies for dealing with the problem of pattern recognition in neural nets. In particular, we emphasize the role of matching the intrinsic correlations within the input patterns in achieving optimal pattern recognition. In this context, the first two strategies, which we applied to different problems and discuss in this paper, consist essentially in adding either white noise or colored noise (deterministic chaos) during input pattern pre-processing, so as to make class separation easier for a classical backpropagation algorithm when the input patterns are, respectively, too correlated among themselves or, on the contrary, too noisy. The third, more radical strategy, which we applied to very hard pattern recognition problems in HEP (high-energy physics) experiments, consists in an automatic (dynamic) redefinition of the network topology itself based on the inner correlations of the inputs.
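To make the first two strategies concrete, the following minimal sketch (not the authors' original implementation; all function names and parameters are illustrative assumptions) shows how white noise or a deterministic chaotic signal could be injected into input patterns before they are passed to an ordinary backpropagation classifier; the logistic map is used here merely as a stand-in source of deterministic chaos.

    import numpy as np

    def add_white_noise(patterns, sigma=0.05, rng=None):
        # Decorrelate overly similar input patterns with i.i.d. Gaussian noise.
        rng = np.random.default_rng() if rng is None else rng
        return patterns + rng.normal(0.0, sigma, size=patterns.shape)

    def logistic_chaos(n, r=4.0, x0=0.3):
        # Deterministic chaotic sequence from the logistic map x -> r*x*(1-x).
        x = np.empty(n)
        x[0] = x0
        for i in range(1, n):
            x[i] = r * x[i - 1] * (1.0 - x[i - 1])
        return x

    def add_colored_noise(patterns, amplitude=0.05):
        # Perturb each pattern with a correlated (chaotic) signal
        # rather than with white noise.
        noisy = np.array(patterns, dtype=float)
        for k, p in enumerate(noisy):
            # A different seed x0 per pattern avoids identical perturbations.
            chaos = logistic_chaos(p.size, x0=0.1 + 0.8 * (k + 1) / (len(noisy) + 1))
            noisy[k] = p + amplitude * (chaos - chaos.mean())
        return noisy

    # The pre-processed patterns would then replace the raw inputs
    # fed to a classical backpropagation network.

Under this reading, add_white_noise would be the natural choice when the input patterns are too correlated among themselves, and add_colored_noise when they are too noisy; the third strategy (dynamic redefinition of the network topology) is not sketched here, since it depends on details of the HEP applications discussed later in the paper.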