Data classification with a relaxed model of variable kernel density estimation

In recent years, kernel density estimation has been exploited by computer scientists to model several important problems in machine learning, bioinformatics, and computer vision. However, when the dimension of the data set is high, conventional kernel density estimators suffer from poor convergence rates of the pointwise mean square error (MSE) and the integrated mean square error (IMSE). Designing a kernel density estimator that overcomes this problem has therefore been a longstanding challenge. This paper proposes a relaxed model of variable kernel density estimation and analyzes its performance in data classification applications. It is proved that, in terms of pointwise MSE, the convergence rate of the relaxed variable kernel density estimator can approach O(n^{-1}) regardless of the dimension of the data set, where n is the number of sampling instances. Experiments with data classification applications show that the improved convergence rate of the pointwise MSE leads to higher prediction accuracy. In fact, the experimental results also show that the data classifier constructed with the relaxed variable kernel density estimator delivers the same level of prediction accuracy as the SVM with the Gaussian kernel.
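As a point of reference for the approach described above, the following is a minimal sketch of a Bayes classifier built on a *variable-bandwidth* kernel density estimator, where each training point receives its own bandwidth proportional to the distance to its k-th nearest neighbour (the classic Breiman-Meisel-Purcell scheme). This is an illustrative baseline only, not the paper's relaxed model; the function names and the choice of a Gaussian kernel are assumptions made for the example.

```python
import numpy as np

def variable_kde(train, query, k=5):
    """Variable-bandwidth Gaussian KDE: each training point gets a bandwidth
    proportional to the distance to its k-th nearest neighbour (an
    illustrative scheme, not the paper's relaxed estimator)."""
    n, d = train.shape
    # Pairwise distances among training points set the per-point bandwidths.
    dists = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=-1)
    h = np.sort(dists, axis=1)[:, min(k, n - 1)]  # k-th NN distance per point
    h = np.maximum(h, 1e-12)                      # guard against zero bandwidth
    # Evaluate the mixture of per-point Gaussians at each query point.
    q = np.linalg.norm(query[:, None, :] - train[None, :, :], axis=-1)
    kernels = np.exp(-0.5 * (q / h) ** 2) / ((np.sqrt(2 * np.pi) * h) ** d)
    return kernels.mean(axis=1)

def kde_classify(X_train, y_train, X_query, k=5):
    """Bayes rule: assign each query to the class whose prior-weighted
    class-conditional density estimate is largest."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        scores.append(prior * variable_kde(Xc, X_query, k))
    return classes[np.argmax(np.stack(scores), axis=0)]
```

The adaptive bandwidth is what lets such estimators track local density: tight kernels in dense regions, wide kernels in sparse ones.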
