Consistent identification of NARX models via regularization networks

Regularization networks are nonparametric estimators obtained by applying Tychonov regularization or Bayes estimation to the hypersurface reconstruction problem. Under symmetry assumptions, they are a particular type of radial basis function neural network. In this correspondence, it is shown that such networks guarantee consistent identification of a very general (infinite-dimensional) class of NARX models. The proofs are based on the theory of reproducing kernel Hilbert spaces and the notion of frequency of time probability, by means of which it is not necessary to assume that the input is sampled from a stochastic process.
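As a minimal illustration of the estimator class discussed above, the sketch below fits a regularization network (kernel ridge regression with a Gaussian radial basis function kernel, via the representer theorem) to data from a synthetic first-order NARX system. The system `f`, the kernel width, and the regularization parameter `lam` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic NARX system (assumed for illustration): y[t] = f(y[t-1], u[t-1]) + noise.
def f(y_prev, u_prev):
    return 0.5 * np.tanh(y_prev) + 0.8 * u_prev

T = 300
u = rng.uniform(-1.0, 1.0, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = f(y[t - 1], u[t - 1]) + 0.01 * rng.standard_normal()

# NARX regressors x_t = (y[t-1], u[t-1]) and targets y[t].
X = np.column_stack([y[:-1], u[:-1]])
Y = y[1:]

def gaussian_kernel(A, B, width=1.0):
    # Radial basis function kernel K(a, b) = exp(-||a - b||^2 / (2 width^2)).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width**2))

# Tychonov-regularized estimate: by the representer theorem the minimizer
# over the RKHS is f_hat(x) = sum_i c_i K(x, x_i), with coefficients c
# solving (K + lam * n * I) c = Y.
lam = 1e-3  # regularization parameter (chosen by hand here)
n = len(Y)
K = gaussian_kernel(X, X)
c = np.linalg.solve(K + lam * n * np.eye(n), Y)

def f_hat(X_new):
    return gaussian_kernel(X_new, X) @ c

# One-step-ahead prediction error on the training regressors.
rmse = np.sqrt(np.mean((f_hat(X) - Y) ** 2))
```

The symmetric (radial) kernel is what makes this estimator an RBF network; consistency in the sense of the paper concerns the behavior of `f_hat` as the number of data points grows.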