Decision Combination of Multiple Classifiers

To improve pattern-classification performance, multiple classifiers can be employed and their individual decisions combined into a final decision. In this paper, we present a combination method based on Bayesian decision theory and compare minimum error rates. The method requires the a posteriori probabilities from all classifiers, which are difficult to obtain in practice because tremendous numbers of training samples would be needed; instead, a confusion matrix is used to approximate them. We also evaluate several alternative combining rules for comparison and apply them to handwritten digit recognition.
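The confusion-matrix approximation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each classifier emits a hard class label, that the classifiers' errors are treated as independent (a naive-Bayes-style assumption), and the function names are hypothetical. Each column of a classifier's confusion matrix, normalized, approximates P(true class | that classifier's output); the combined decision maximizes the product of these approximate posteriors.

```python
import numpy as np

def posterior_from_confusion(cm):
    """Approximate P(true class = i | classifier output = j) from a
    confusion matrix cm, where cm[i, j] counts samples of true class i
    that the classifier labelled j. Each column is normalised into a
    probability vector over the true classes."""
    col_sums = cm.sum(axis=0, keepdims=True)
    return cm / np.maximum(col_sums, 1)  # guard against empty columns

def bayesian_combine(conf_mats, decisions):
    """Combine hard decisions from several classifiers.

    conf_mats : list of (n_classes, n_classes) integer arrays, one
                confusion matrix per classifier (estimated on held-out
                training data, as the abstract suggests).
    decisions : list of class labels, one per classifier.

    Returns the class index maximising the product of the approximate
    posteriors, assuming the classifiers err independently."""
    n_classes = conf_mats[0].shape[0]
    belief = np.ones(n_classes)
    for cm, j in zip(conf_mats, decisions):
        belief *= posterior_from_confusion(cm)[:, j]
    return int(np.argmax(belief))

# Toy example: two 3-class classifiers with known confusion matrices.
cm1 = np.array([[8, 1, 1],
                [1, 8, 1],
                [0, 2, 8]])
cm2 = np.array([[9, 0, 1],
                [2, 7, 1],
                [1, 1, 8]])

# Both classifiers say class 0: the combined decision agrees.
print(bayesian_combine([cm1, cm2], [0, 0]))
```

Note that with hard labels this reduces to a weighted voting scheme in which each vote is weighted by the classifier's column-wise reliability; richer rules (sum, product, max over soft outputs) are the "different combining rules" the abstract compares.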
