Algorithm of designing compound recognition system on the basis of combining classifiers with simultaneous splitting feature space into competence areas

The paper presents a novel adaptive splitting and selection algorithm (AdaSS) for learning a compound pattern recognition system. The key processes of the proposed model are splitting the feature space into competence areas and selecting, for each area, the best classifier from a pool of available recognizers. Both processes take place simultaneously as part of a compound optimization procedure aimed at maximizing system performance, and evolutionary algorithms are used to find the optimal solution. Experimental results confirm the quality of the proposed approach.
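The decision rule implied by the abstract can be illustrated with a minimal sketch: the feature space is partitioned into competence areas, each area is assigned one classifier from the pool, and a sample is classified by the classifier selected for its area, while the evolutionary search would maximize the resulting accuracy. The class name, the Voronoi-style area definition, and the helper names below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class AdaSSEnsemble:
    """Illustrative model of area-based classifier selection (not the paper's code)."""

    def __init__(self, centroids, assignment, pool):
        self.centroids = np.asarray(centroids)   # one row per competence area
        self.assignment = assignment             # area index -> index of the selected classifier
        self.pool = pool                         # pool of trained classifiers with a predict() method

    def _area(self, x):
        # Assumed area definition: nearest centroid (Voronoi cell).
        return int(np.argmin(np.linalg.norm(self.centroids - x, axis=1)))

    def predict(self, X):
        # Each sample is handled by the classifier assigned to its competence area.
        return np.array([
            self.pool[self.assignment[self._area(x)]].predict(x.reshape(1, -1))[0]
            for x in np.asarray(X)
        ])

def fitness(ensemble, X, y):
    # Objective that the evolutionary search would maximize for a candidate
    # (centroids, assignment) pair: classification accuracy of the ensemble.
    return float(np.mean(ensemble.predict(X) == y))
```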
