Bootstrap-Inspired Techniques in Computational Intelligence
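The bootstrap idea underlying much of the work cited below (Efron's resampling estimator, bagging, and related ensemble methods) can be sketched in a few lines: draw many same-size samples with replacement from the data and measure how a statistic varies across them. The following is a minimal illustrative sketch, not taken from the paper itself; the function name and defaults are assumptions.

```python
import random
import statistics

def bootstrap_std_error(data, stat=statistics.mean, n_resamples=2000, seed=0):
    """Estimate the standard error of `stat` by bootstrap resampling.

    Draws `n_resamples` samples of the same size as `data`, with
    replacement, applies `stat` to each, and returns the standard
    deviation of the resulting estimates (the bootstrap standard error).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(data)
    estimates = [
        stat([data[rng.randrange(n)] for _ in range(n)])  # one bootstrap sample
        for _ in range(n_resamples)
    ]
    return statistics.stdev(estimates)

# Hypothetical data for illustration only.
data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9]
se = bootstrap_std_error(data)
```

For the sample mean, this bootstrap estimate should land close to the classical formula s/√n; ensemble methods such as bagging reuse the same resampling step but train one classifier per bootstrap sample instead of computing a statistic.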
[1] Robi Polikar,et al. Can AdaBoost.M1 Learn Incrementally? A Comparison to Learn++ Under Different Combination Rules , 2006, ICANN.
[2] Leo Breiman,et al. Pasting Small Votes for Classification in Large Databases and On-Line , 1999, Machine Learning.
[3] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[4] Yoav Freund,et al. Boosting the margin: A new explanation for the effectiveness of voting methods , 1997, ICML.
[5] Ludmila I. Kuncheva,et al. Classifier Ensembles for Changing Environments , 2004, Multiple Classifier Systems.
[6] R. Tibshirani,et al. Improvements on Cross-Validation: The 632+ Bootstrap Method , 1997 .
[7] Geoffrey E. Hinton,et al. Adaptive Mixtures of Local Experts , 1991, Neural Computation.
[8] Jiri Matas,et al. On Combining Classifiers , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[9] B. Efron. Bootstrap Methods: Another Look at the Jackknife , 1979, The Annals of Statistics.
[10] Yoav Freund,et al. A decision-theoretic generalization of on-line learning and an application to boosting , 1995, EuroCOLT.
[11] Marcel J. T. Reinders,et al. Random subspace method for multivariate feature selection , 2006, Pattern Recognit. Lett..
[12] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[13] Robi Polikar,et al. An Ensemble Approach for Incremental Learning in Nonstationary Environments , 2007, MCS.
[14] B. Efron. Estimating the Error Rate of a Prediction Rule: Improvement on Cross-Validation , 1983 .
[15] C. D. Nealy,et al. Estimation of error rate for linear discriminant functions by resampling: non-Gaussian populations , 1988 .
[16] P. Boland. Majority Systems and the Condorcet Jury Theorem , 1989 .
[17] Vasant Honavar,et al. Learn++: an incremental learning algorithm for supervised neural networks , 2001, IEEE Trans. Syst. Man Cybern. Part C.
[18] B.V. Dasarathy,et al. A composite classifier system design: Concepts and methodology , 1979, Proceedings of the IEEE.
[19] Robi Polikar,et al. An Ensemble-Based Incremental Learning Approach to Data Fusion , 2007, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).
[20] R. Schapire. The Strength of Weak Learnability , 1990, Machine Learning.
[21] D. J. Newman,et al. UCI Repository of Machine Learning Databases , 1998 .
[22] Ron Kohavi,et al. A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection , 1995, IJCAI.
[23] Robi Polikar,et al. Random Feature Subset Selection for Ensemble Based Classification of Data with Missing Features , 2007, MCS.
[24] Ludmila I. Kuncheva,et al. A Theoretical Study on Six Classifier Fusion Strategies , 2002, IEEE Trans. Pattern Anal. Mach. Intell..
[25] Yoav Freund,et al. A decision-theoretic generalization of on-line learning and an application to boosting , 1997, J. Comput. Syst. Sci..
[26] David H. Wolpert,et al. Stacked generalization , 1992, Neural Networks.
[27] Gutti Jogesh Babu. Bootstrap Techniques for Signal Processing , 2005, Technometrics.
[28] Stephen Grossberg,et al. Nonlinear neural networks: Principles, mechanisms, and architectures , 1988, Neural Networks.
[29] Anil K. Jain,et al. Bootstrap Techniques for Error Estimation , 1987, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[30] Robert A. Jacobs,et al. Hierarchical Mixtures of Experts and the EM Algorithm , 1994, Neural Computation.