A novel weighted nearest neighbor ensemble classifier

Recent work has shown that combining several classifiers is an effective way to improve classification accuracy. Many ensemble approaches, such as bagging and boosting, have been introduced and have reduced the generalization error of various classifiers; however, these methods fail to improve the performance of the Nearest Neighbor (NN) classifier. In this paper, a novel weighted NN ensemble technique (WNNE) is presented for improving the performance of the NN classifier. WNNE combines several NN classifiers, each operating on a different subset of the input feature set. The algorithm assigns a weight to each classifier and uses a weighted vote among the classifiers to determine the output of the ensemble. We evaluated the proposed method on several datasets from the UCI Repository and compared it against the NN classifier and the Random Subspace Method (RSM). The results show that our method outperforms both approaches.
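The scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's exact algorithm: each ensemble member is a 1-NN classifier restricted to a random feature subset, and its weight is assumed here to be its accuracy on a held-out validation set (the paper's actual weighting rule may differ). The function and variable names are illustrative.

```python
import random
from collections import Counter

def nn_predict(train, x, features):
    # 1-NN over the given feature subset, using squared Euclidean distance.
    best = min(train, key=lambda t: sum((t[0][f] - x[f]) ** 2 for f in features))
    return best[1]

def build_wnne(train, val, n_members=5, subset_size=2, seed=0):
    """Build a weighted NN ensemble: each member is a 1-NN classifier on a
    random feature subset; its weight is its validation accuracy
    (an assumption -- the paper's exact weighting scheme may differ)."""
    rng = random.Random(seed)
    n_feats = len(train[0][0])
    members = []
    for _ in range(n_members):
        feats = rng.sample(range(n_feats), subset_size)
        acc = sum(nn_predict(train, x, feats) == y for x, y in val) / len(val)
        members.append((feats, acc))
    return members

def wnne_predict(train, members, x):
    # Weighted vote: each member's predicted label receives that member's weight.
    votes = Counter()
    for feats, weight in members:
        votes[nn_predict(train, x, feats)] += weight
    return votes.most_common(1)[0][0]
```

Because each member sees a different feature subspace, members make partially independent errors, and the validation-based weights down-weight members whose subspace happens to be uninformative.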
