Bagging, Boosting and the Random Subspace Method for Linear Classifiers
[1] Leo Breiman, et al. Bagging Predictors, 1996, Machine Learning.
[2] Robert Tibshirani, et al. An Introduction to the Bootstrap, 1994.
[3] Robert P. W. Duin, et al. Boosting in Linear Discriminant Analysis, 2000, Multiple Classifier Systems.
[4] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1995, EuroCOLT.
[5] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting, 1997, J. Comput. Syst. Sci.
[6] Sarunas Raudys, et al. On Dimensionality, Sample Size, Classification Error, and Complexity of Classification Algorithm in Pattern Recognition, 1980, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[7] Yoav Freund, et al. Boosting the margin: A new explanation for the effectiveness of voting methods, 1997, ICML.
[8] Anil K. Jain, et al. Dimensionality and sample size considerations in pattern recognition practice, 1982, Classification, Pattern Recognition and Reduction of Dimensionality.
[9] L. Breiman. Arcing Classifiers, 1998.
[10] Vladimir N. Vapnik, et al. The Nature of Statistical Learning Theory, 2000, Statistics for Engineering and Information Science.
[11] Robert P. W. Duin, et al. The Role of Combining Rules in Bagging and Boosting, 2000, SSPR/SPR.
[12] Yoshua Bengio, et al. Boosting Neural Networks, 2000, Neural Computation.
[13] Corinna Cortes, et al. Support-Vector Networks, 1995, Machine Learning.
[14] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization, 2000, Machine Learning.
[15] Robert P. W. Duin, et al. Bagging for linear classifiers, 1998, Pattern Recognit.
[16] Tin Kam Ho, et al. The Random Subspace Method for Constructing Decision Forests, 1998, IEEE Trans. Pattern Anal. Mach. Intell.
[17] J. Friedman. Regularized Discriminant Analysis, 1989.
[18] Robert P. W. Duin, et al. Expected classification error of the Fisher linear classifier with pseudo-inverse covariance matrix, 1998, Pattern Recognit. Lett.
[19] Nathan Intrator, et al. Boosted Mixture of Experts: An Ensemble Learning Scheme, 1999, Neural Computation.
[20] Josef Kittler, et al. Population bias control for bagging k-NN experts, 2001, SPIE Defense + Commercial Sensing.
[21] Eric Bauer, et al. An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants, 1999, Machine Learning.
[22] Robert P. W. Duin, et al. Bagging and the Random Subspace Method for Redundant Feature Spaces, 2001, Multiple Classifier Systems.
[23] Catherine Blake, et al. UCI Repository of machine learning databases, 1998.
[24] Guozhong An, et al. The Effects of Adding Noise During Backpropagation Training on a Generalization Performance, 1996, Neural Computation.
[25] Tin Kam Ho, et al. Nearest Neighbors in Random Subspaces, 1998, SSPR/SPR.
[26] Robert P. W. Duin, et al. Combining Fisher Linear Discriminants for Dissimilarity Representations, 2000, Multiple Classifier Systems.
[27] Yoav Freund, et al. Experiments with a New Boosting Algorithm, 1996, ICML.
[28] L. Breiman. Random Forests - Random Features, 1999.
[29] Keinosuke Fukunaga, et al. Introduction to Statistical Pattern Recognition, 1972.