Dynamic Management of Multiple Classifiers in Complex Recognition System

Complex recognition systems employ multiple classifiers of different kinds in pursuit of better recognition capability. To fully exploit the potential of the individual classifiers and enable them to cooperate toward the best classification results, they must be treated as a whole and managed dynamically as the recognition conditions change. In this paper, we present the concept of distributed Multiple Classifiers Management (MCM) and a self-adaptive recursive MCM model based on the Mixture-of-Experts (ME) architecture. The model includes a control subsystem that allows the classification process to be guided by the system's prior information when necessary. The model adjusts its parameters dynamically according to the current recognition state and produces its recognition results by combining the current individual classifiers' outputs with the previous combined result, under the control of prior information. An algorithm based on one-step error correction is presented to acquire the model's parameters dynamically: it treats the previous ensemble classification results as ground truth and corrects the current classifier weights accordingly. Finally, a simulated experiment on the recognition of space objects is reported. The results show that the proposed MCM model effectively improves the recognition rate and robustness of complex recognition systems containing heterogeneous classifiers.
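The recursive combination and one-step error-correction update described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the multiplicative weight-update rule, the blending factor `alpha`, and all function names are assumptions introduced here. At each time step the current weighted votes are blended with the previous combined score, the blended decision is treated as a pseudo ground truth, and the weights of classifiers that disagreed with it are reduced.

```python
import numpy as np

def one_step_error_correction(weights, preds, pseudo_label, lr=0.1):
    """Adjust classifier weights, treating the previous ensemble
    decision as pseudo ground truth (illustrative update rule)."""
    agreed = np.array([p == pseudo_label for p in preds], dtype=float)
    # Reward classifiers that agreed with the ensemble, penalize the rest.
    weights = weights * np.exp(lr * (2.0 * agreed - 1.0))
    return weights / weights.sum()  # renormalize to a distribution

def recursive_ensemble(streams, n_classes, lr=0.1, alpha=0.5):
    """streams: list over time steps; each entry is a list of class
    labels, one per individual classifier. Combines the current
    weighted votes with the previous combined score vector."""
    n_clf = len(streams[0])
    weights = np.full(n_clf, 1.0 / n_clf)           # uniform initial weights
    prev_score = np.full(n_classes, 1.0 / n_classes)  # uniform initial score
    decisions = []
    for preds in streams:
        # Current weighted vote per class.
        score = np.zeros(n_classes)
        for w, p in zip(weights, preds):
            score[p] += w
        # Recursive blend with the previous combined result.
        score = alpha * score + (1.0 - alpha) * prev_score
        decision = int(np.argmax(score))
        # One-step error correction against the blended decision.
        weights = one_step_error_correction(weights, preds, decision, lr)
        prev_score = score
        decisions.append(decision)
    return decisions, weights
```

Because both the vote vector and the previous score sum to one, the blended score remains a distribution at every step; a persistently dissenting classifier sees its weight decay geometrically.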
