Complexity analysis of RBF networks for pattern recognition

This paper addresses non-parametric probability density function (PDF) estimation with Radial Basis Function (RBF) neural networks. We investigate two criteria, both based on a modified Kullback-Leibler distance, that guide the choice of network architecture complexity. In the first criterion, the modification consists of adding a term that penalizes complex architectures (the MPL criterion). The second strategy regularizes the network by imposing lower bounds on the kernel standard deviations, derived from conditions of existence of rejection tests (the LBSD criterion). Experimental results indicate that the MPL criterion outperforms the LBSD method.
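To make the model-selection idea concrete, the sketch below scores RBF density estimates of increasing size with a penalized negative log-likelihood and selects the minimizer. The MDL-style penalty, the random center placement, the bandwidth heuristic, and the sigma_min floor (echoing, in spirit, the LBSD lower bound on the standard deviation) are illustrative assumptions, not the paper's exact MPL or LBSD formulas.

```python
# A minimal sketch, assuming an equal-weight isotropic Gaussian RBF density
# model. The penalty term and fitting heuristics below are illustrative
# stand-ins for the paper's MPL criterion, not its actual definition.
import numpy as np

def rbf_log_density(x, centers, sigma):
    """Log of an equal-weight isotropic Gaussian mixture evaluated at x."""
    d = x.shape[1]
    # Squared distances from every sample to every center: shape (n, k)
    sq = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    log_comp = -0.5 * sq / sigma**2 - 0.5 * d * np.log(2 * np.pi * sigma**2)
    # log-mean-exp over components gives the mixture log-density
    m = log_comp.max(axis=1, keepdims=True)
    return m.squeeze(1) + np.log(np.exp(log_comp - m).mean(axis=1))

def penalized_score(x, k, rng, sigma_min=0.0):
    """Negative log-likelihood plus a complexity penalty for k kernels."""
    n, d = x.shape
    centers = x[rng.choice(n, size=k, replace=False)]  # crude center placement
    # Heuristic bandwidth: RMS nearest-center distance, floored by sigma_min
    sq = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    sigma = max(np.sqrt(sq.min(axis=1).mean()) + 1e-12, sigma_min)
    nll = -rbf_log_density(x, centers, sigma).sum()
    penalty = 0.5 * k * (d + 1) * np.log(n)  # MDL-style complexity term
    return nll + penalty

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, (150, 1)), rng.normal(2, 0.8, (150, 1))])
best_k = min(range(1, 11), key=lambda k: penalized_score(x, k, rng))
print("selected number of kernels:", best_k)
```

Passing a positive sigma_min turns the same score into a crude analogue of the LBSD strategy: rather than penalizing complexity directly, it prevents the kernels from shrinking onto individual samples, which is what drives the likelihood of over-complex models upward.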
