An Extended MKNN: Modified K-Nearest Neighbor

In this paper, a new classification method is proposed that enhances the performance of K-Nearest Neighbor (KNN) by relying on robust neighbors in the training data, which are detected through a validation process. The resulting method, called Modified K-Nearest Neighbor (MKNN), is more robust than its traditional counterparts. Inspired by the traditional KNN algorithm, the main idea is to classify test samples according to the labels of their neighbors. MKNN is a weighted variant of KNN in which the weights are determined by a validity measure: for each training sample, the fraction of its neighbors that share its label. The proposed method is evaluated on a variety of standard UCI data sets, and the experiments show a clear improvement in accuracy over the standard KNN method.
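A minimal sketch of the idea described above, under stated assumptions: validity is computed as the fraction of same-labeled neighbors among each training point's h nearest training neighbors, and the final vote weights each neighbor by its validity divided by its distance plus a small smoothing constant. The Euclidean metric, the 0.5 smoothing constant, and the function names (compute_validity, mknn_predict) are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from collections import defaultdict

def compute_validity(X_train, y_train, h=10):
    """Validity of each training point: the fraction of its h nearest
    training neighbors (Euclidean distance, assumed) sharing its label."""
    n = len(X_train)
    validity = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X_train - X_train[i], axis=1)
        d[i] = np.inf                      # exclude the point itself
        nn = np.argsort(d)[:h]
        validity[i] = np.mean(y_train[nn] == y_train[i])
    return validity

def mknn_predict(x, X_train, y_train, validity, k=5):
    """Classify x by a validity- and distance-weighted vote of its k
    nearest training neighbors (the 0.5 smoothing term is an assumption)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    votes = defaultdict(float)
    for i in nn:
        votes[y_train[i]] += validity[i] / (d[i] + 0.5)
    return max(votes, key=votes.get)
```

Typical usage would compute the validity scores once over the training set (e.g. validity = compute_validity(X_train, y_train)) and then call mknn_predict for each test sample; because the validity is precomputed, the per-query cost matches that of ordinary weighted KNN.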
