Combining both ensemble and dynamic classifier selection schemes for prediction of mobile internet subscribers
[1] Leo Breiman. Bagging Predictors, 1996, Machine Learning.
[2] R. Mojena. Hierarchical Grouping Methods and Stopping Rules: An Evaluation, 1977, Comput. J.
[3] G. W. Milligan, et al. An examination of procedures for determining the number of clusters in a data set, 1985.
[4] Fabio Roli, et al. Dynamic classifier selection based on multiple classifier behaviour, 2001, Pattern Recognit.
[5] Fabio Roli, et al. An approach to the automatic design of multiple classifier systems, 2001, Pattern Recognit. Lett.
[6] Jiri Matas, et al. On Combining Classifiers, 1998, IEEE Trans. Pattern Anal. Mach. Intell.
[7] Ludmila I. Kuncheva. Clustering-and-selection model for classifier combination, 2000, KES 2000, Fourth International Conference on Knowledge-Based Intelligent Engineering Systems and Allied Technologies.
[8] Robert P. W. Duin, et al. Bagging, Boosting and the Random Subspace Method for Linear Classifiers, 2002, Pattern Analysis & Applications.
[9] David M. Sebert, et al. A clustering algorithm for identifying multiple outliers in linear regression, 1998.
[10] C. J. Whitaker, et al. Ten measures of diversity in classifier ensembles: limits for two classifiers, 2001.
[11] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization, 2000, Machine Learning.
[12] Kevin W. Bowyer, et al. Combination of Multiple Classifiers Using Local Accuracy Estimates, 1997, IEEE Trans. Pattern Anal. Mach. Intell.
[13] J. Neter, et al. Applied Linear Regression Models, 1983.
[14] Tin Kam Ho. The Random Subspace Method for Constructing Decision Forests, 1998, IEEE Trans. Pattern Anal. Mach. Intell.
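
The listing above names the two building blocks the title combines: an ensemble scheme such as bagging [1] and dynamic classifier selection driven by local accuracy estimates [12]. As a rough illustration only, not the paper's actual method or settings, the scikit-learn sketch below bags a pool of decision trees and then, for each test sample, hands the decision to whichever base classifier is most accurate on that sample's nearest validation-set neighbours; the synthetic dataset, pool size and neighbourhood size are arbitrary assumptions.

```python
# Minimal sketch: bagging ensemble + dynamic classifier selection by local
# accuracy (DCS-LA). Illustrative assumptions throughout; not the paper's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import NearestNeighbors
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a subscriber-prediction dataset (assumption, not the paper's data).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Ensemble scheme: a bagged pool of decision trees (the default base estimator), cf. [1].
pool = BaggingClassifier(n_estimators=25, random_state=0).fit(X_train, y_train)

# Cache every base classifier's predictions on a held-out validation set.
val_preds = np.array([clf.predict(X_val) for clf in pool.estimators_])  # shape (n_clf, n_val)
nn = NearestNeighbors(n_neighbors=15).fit(X_val)

def dcs_la_predict(X_query):
    """Dynamic classifier selection by local accuracy, in the spirit of [12]."""
    _, idx = nn.kneighbors(X_query)            # validation-set neighbourhood of each query
    out = np.empty(len(X_query), dtype=y.dtype)
    for i, neigh in enumerate(idx):
        local_acc = (val_preds[:, neigh] == y_val[neigh]).mean(axis=1)
        best = int(np.argmax(local_acc))       # the most locally accurate base classifier wins
        out[i] = pool.estimators_[best].predict(X_query[i:i + 1])[0]
    return out

print("bagging (whole pool):", accuracy_score(y_test, pool.predict(X_test)))
print("bagging + DCS-LA    :", accuracy_score(y_test, dcs_la_predict(X_test)))
```

The same bagged pool could instead be partitioned with a clustering-and-selection step as in [7]; the nearest-neighbour local-accuracy rule above is simply the most compact selector to write down.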