Adaptive edited natural neighbor algorithm

Reduction techniques can reduce the prohibitive computational cost and storage requirements of pattern classification while maintaining classification accuracy. The edited nearest neighbor rule is one of the most popular reduction techniques: it removes noisy patterns that are not correctly classified by their k nearest neighbors. However, the selection of neighborhood parameters remains an unsolved problem for traditional neighborhood construction algorithms such as k-nearest neighbor and ε-neighborhood. To address this problem, we present a novel editing algorithm called the adaptive Edited Natural Neighbor algorithm (ENaN). ENaN eliminates noisy patterns based on the concept of the natural neighbor, whose neighborhoods are obtained adaptively by the natural neighbor search algorithm. Its main advantages are that it requires no parameters and mitigates the effect of noisy patterns. The adaptive ENaN algorithm can also be easily applied to other reduction algorithms as a noise filter. Experiments show that the proposed approach effectively removes noisy patterns while preserving more reasonable class boundaries, and greatly improves the performance of two condensation methods in terms of both accuracy and reduction rate.
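The two stages described above — an adaptive, parameter-free natural-neighbor search followed by an editing step that discards patterns contradicted by their natural neighbors — can be sketched as follows. This is a minimal illustration under common formulations of the natural-neighbor idea; function names, the mutual-neighbor stopping rule, and the majority-vote editing criterion are assumptions, not taken verbatim from the paper.

```python
# Hypothetical sketch of natural-neighbor search and ENaN-style editing.
# Assumption: the search grows k until every point appears in some other
# point's k-NN list, and natural neighbors are the mutual k-NN pairs.
import numpy as np

def natural_neighbors(X):
    """Adaptively grow k until no point is left without an inbound
    neighbor, then return each point's set of mutual (natural) neighbors."""
    n = len(X)
    # Pairwise distances; self-distance set to infinity so a point
    # never counts as its own neighbor.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    order = np.argsort(d, axis=1)       # each row: neighbor indices by distance
    knn = [set() for _ in range(n)]     # current k-NN set of each point
    for k in range(1, n):
        for i in range(n):
            knn[i].add(order[i, k - 1])
        # Stop when every point is contained in someone's k-NN list.
        in_count = np.zeros(n, dtype=int)
        for i in range(n):
            for j in knn[i]:
                in_count[j] += 1
        if np.all(in_count > 0):
            break
    # Natural neighbors: the mutual pairs (j in knn[i] and i in knn[j]).
    return [{j for j in knn[i] if i in knn[j]} for i in range(n)]

def enan_edit(X, y):
    """Editing step: drop patterns whose label disagrees with the
    majority label of their natural neighbors; return kept indices."""
    nan_sets = natural_neighbors(X)
    keep = []
    for i, neigh in enumerate(nan_sets):
        if not neigh:
            continue                     # isolated point: treated as noise
        labels = [y[j] for j in neigh]
        majority = max(set(labels), key=labels.count)
        if majority == y[i]:
            keep.append(i)
    return keep
```

As a usage example, a point labeled differently from the cluster it sits inside is removed, while both clean clusters survive intact — the "noise filter" behavior the abstract attributes to ENaN when placed in front of a condensation method.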
