Improving Product by Moderating k-NN Classifiers

The veto effect, caused by contradicting experts outputting zero probability estimates, leads to fusion strategies performing suboptimally. This can be resolved by moderation. The moderation formula is derived for the k-NN classifier using a Bayesian prior. The merits of moderation are examined on real data sets.
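A minimal sketch of the idea follows. It assumes the commonly used moderated k-NN estimate under a uniform Bayesian prior, P(class j | x) = (k_j + 1)/(k + C), where k_j is the number of the k neighbours voting for class j and C is the number of classes; the function names and the toy vote counts are illustrative, not taken from the paper.

```python
import numpy as np

def knn_estimate(votes, k):
    # raw k-NN posterior: fraction of neighbours voting for each class;
    # any class with no votes gets probability exactly zero
    return votes / k

def moderated_knn_estimate(votes, k, n_classes):
    # moderated estimate under a uniform Bayesian prior:
    # P(class_j | x) = (k_j + 1) / (k + C), which is never exactly zero
    return (votes + 1) / (k + n_classes)

# two k-NN experts, 3 classes, k = 5 neighbours each, contradicting each other
votes_a = np.array([5, 0, 0])   # expert A: all 5 neighbours vote class 0
votes_b = np.array([0, 5, 0])   # expert B: all 5 neighbours vote class 1

# product fusion with raw estimates: each expert's zeros veto the
# other's preferred class, so every fused score collapses to zero
raw_product = knn_estimate(votes_a, 5) * knn_estimate(votes_b, 5)

# product fusion with moderated estimates: all classes stay alive,
# and classes 0 and 1 correctly tie ahead of class 2
mod_product = (moderated_knn_estimate(votes_a, 5, 3)
               * moderated_knn_estimate(votes_b, 5, 3))

print(raw_product)   # every class vetoed to zero
print(mod_product)   # all strictly positive
```

With the raw estimates the product rule cannot rank the classes at all; moderation trades a small bias toward the prior for robustness against a single expert's zero.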
