Validation Based Modified K‐Nearest Neighbor

In this paper, a new classification method is proposed that enhances the performance of K-Nearest Neighbor (KNN) by relying on robust neighbors in the training data. These robust neighbors are identified through a validation process, which makes the method more robust than comparable traditional approaches. The new classifier is called Modified K-Nearest Neighbor (MKNN). Inspired by the traditional KNN algorithm, the main idea is to classify test samples according to the labels of their neighbors. The method is a form of weighted KNN in which the weights are determined by a validation procedure: for each training sample, the fraction of its neighbors that share its label, out of the total number of neighbors considered, is computed and used as that sample's weight. The proposed method is evaluated on several standard UCI data sets, and experiments show a clear improvement in accuracy over the standard KNN method.
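The following is a minimal sketch of one way such a validity-weighted KNN could be implemented. It assumes the weight of each training sample is its validity (the fraction of same-labeled neighbors among its H nearest training neighbors) optionally combined with an inverse-distance factor; the function names, the parameter H, and the 1/(distance + eps) term are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def compute_validity(X_train, y_train, H=7):
    """For each training sample, the fraction of its H nearest training
    neighbors (Euclidean distance) that share its label."""
    n = len(X_train)
    validity = np.empty(n)
    # pairwise distances within the training set
    d = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # a sample is not its own neighbor
    for i in range(n):
        nbrs = np.argsort(d[i])[:H]
        validity[i] = np.mean(y_train[nbrs] == y_train[i])
    return validity

def mknn_predict(X_train, y_train, validity, x, K=5, eps=0.5):
    """Classify x by validity-weighted voting among its K nearest
    training samples; the 1/(d + eps) factor is an assumed weighting."""
    d = np.linalg.norm(X_train - x, axis=1)
    nbrs = np.argsort(d)[:K]
    weights = validity[nbrs] / (d[nbrs] + eps)
    classes = np.unique(y_train[nbrs])
    scores = [weights[y_train[nbrs] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

# Usage: precompute validity once on the training set, then predict.
# X_train, y_train = ...; validity = compute_validity(X_train, y_train)
# label = mknn_predict(X_train, y_train, validity, x_test)
```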
