Pattern classification can be viewed as an ill-posed, inverse problem to which the method of regularization can be applied. In doing so, a proper theoretical framework is provided for the application of radial basis function (RBF) networks to pattern classification, with strong links to the classical kernel regression estimator (KRE)-based classifiers that estimate the underlying posterior class densities. Assuming that the training patterns are labeled with binary-valued vectors indicating their class membership, a regularized solution can be designed so that each resultant network output (one for each class) can be interpreted as a nonparametric estimator of the corresponding posterior (conditional) class distribution. These RBFs generalize the classical KREs, e.g., the Parzen window estimators (PWEs), which can therefore be recovered as a particular limiting case. The authors describe analytically how constraining the classifier's network coefficients to be positive during the solution alters the nature of the original regularization problem, and demonstrate experimentally the beneficial effect that such a constraint has on classifier complexity.
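To make the construction concrete, the following is a minimal numerical sketch (not the authors' exact formulation) of the idea summarized above: an RBF network fitted by regularized least squares on binary-valued (one-hot) class-membership targets, whose outputs can be read as rough posterior-probability estimates, compared against the classical Parzen-window / kernel-regression classifier that it generalizes. The data, kernel width `sigma`, and regularization strength `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian classes in 2-D, labeled with binary-valued (one-hot) membership vectors.
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
               rng.normal(+1.0, 1.0, size=(50, 2))])
Y = np.zeros((100, 2))
Y[:50, 0] = 1.0   # class 0 membership
Y[50:, 1] = 1.0   # class 1 membership

sigma, lam = 1.0, 0.1  # assumed kernel width and regularization strength

def gaussian_kernel(A, B, sigma):
    """Matrix of Gaussian RBF activations between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Regularized RBF network: one basis function per training pattern,
# coefficients obtained from the regularized normal equations (K + lam*I) W = Y.
K = gaussian_kernel(X, X, sigma)
W = np.linalg.solve(K + lam * np.eye(len(X)), Y)

def rbf_posteriors(Xq):
    """Network outputs for query points; one output per class, read as a
    nonparametric estimate of the corresponding posterior class probability."""
    return gaussian_kernel(Xq, X, sigma) @ W

def parzen_posteriors(Xq):
    """Classical kernel-regression (Parzen-window) estimate of the posteriors,
    the limiting case that the regularized RBF solution generalizes."""
    Kq = gaussian_kernel(Xq, X, sigma)
    return (Kq @ Y) / Kq.sum(axis=1, keepdims=True)

Xq = np.array([[-1.0, -1.0], [0.0, 0.0], [1.0, 1.0]])
print("RBF network outputs:\n", rbf_posteriors(Xq))
print("Parzen-window outputs:\n", parzen_posteriors(Xq))
```

The positivity constraint on the coefficients discussed in the abstract is not imposed in this sketch; it could be approximated, for example, by replacing the linear solve with a non-negative least-squares fit per output, at the cost of changing the nature of the regularization problem as the authors describe.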