CCHR: Combination of Classifiers Using Heuristic Retraining

In this paper, a new method for improving the performance of combinational classifier systems is proposed. The main idea behind the method is the heuristic retraining of artificial neural networks (ANNs). In combinational classifier systems, the more diverse the results of the base classifiers are, the better the final result obtained. The presented method for creating this diversity is called heuristic retraining. First, an MLP is trained as a base classifier. Then, with regard to the errors of this base classifier, other MLPs are trained heuristically. Because the main concentration is on error-prone data, the different classifiers are trained with different degrees of emphasis on those data. Finally, the outputs of these retrained MLPs are combined. Although the accuracies of these classifiers are almost similar, their outputs are only weakly correlated because each concentrates on the erroneous data to a different degree. Experimental results show a valuable improvement on two standard datasets, Iris and Wine.
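The retraining procedure described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the hidden-layer sizes, the number of retrained members, and the choice of duplicating misclassified samples as the "emphasis" mechanism are all assumptions made here; majority voting stands in for whichever combination rule the paper uses.

```python
# Hedged sketch of heuristic retraining (assumed details, not the paper's code):
# 1) train a base MLP, 2) find its misclassified training samples,
# 3) retrain copies with those error-prone samples duplicated at increasing
#    rates, 4) combine all members by majority vote.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base classifier.
base = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
base.fit(X_tr, y_tr)
errors = base.predict(X_tr) != y_tr  # mask of error-prone training samples

ensemble = [base]
for k in (1, 2, 3):  # increasing degrees of emphasis on erroneous data
    X_aug = np.vstack([X_tr] + [X_tr[errors]] * k)
    y_aug = np.concatenate([y_tr] + [y_tr[errors]] * k)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=k)
    clf.fit(X_aug, y_aug)
    ensemble.append(clf)

# Combine member outputs by majority vote (one assumed combination rule).
votes = np.stack([clf.predict(X_te) for clf in ensemble])
final = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("ensemble accuracy:", (final == y_te).mean())
```

Duplicating misclassified samples is one simple way to vary each member's concentration on error-prone data without changing the network architecture; the members end up with similar accuracy but less correlated errors, which is what the combination step exploits.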
