Prediction based on averages over automatically induced learners: ensemble methods and Bayesian techniques
[1] F. Wilcoxon. Individual Comparisons by Ranking Methods , 1945 .
[2] Milton Abramowitz,et al. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables , 1964 .
[3] M. Aizerman,et al. Theoretical Foundations of the Potential Function Method in Pattern Recognition Learning , 1964 .
[4] M. Abramowitz,et al. Handbook of Mathematical Functions With Formulas, Graphs and Mathematical Tables (National Bureau of Standards Applied Mathematics Series No. 55) , 1965 .
[5] David S. Johnson,et al. Computers and Intractability: A Guide to the Theory of NP-Completeness , 1978 .
[6] S. Adler. Over-relaxation method for the Monte Carlo evaluation of the partition function for multiquadratic actions , 1981 .
[7] Yoav Freund,et al. Boosting a weak learning algorithm by majority , 1995, COLT '90.
[8] John F. Kolen,et al. Backpropagation is Sensitive to Initial Conditions , 1990, Complex Syst..
[9] R. T. Cox. Probability, frequency and reasonable expectation , 1990 .
[10] Lars Kai Hansen,et al. Neural Network Ensembles , 1990, IEEE Trans. Pattern Anal. Mach. Intell..
[11] Ronald L. Rivest,et al. Introduction to Algorithms , 1990 .
[12] J. Friedman. Multivariate adaptive regression splines , 1990 .
[13] Anders Krogh,et al. A Simple Weight Decay Can Improve Generalization , 1991, NIPS.
[14] Trevor Hastie,et al. Statistical Models in S , 1991 .
[15] J. Kittler,et al. Multistage pattern recognition with reject option , 1992, Proceedings., 11th IAPR International Conference on Pattern Recognition. Vol.II. Conference B: Pattern Recognition Methodology and Systems.
[16] L. Cooper,et al. When Networks Disagree: Ensemble Methods for Hybrid Neural Networks , 1992 .
[17] Adam Krzyżak,et al. Methods of combining multiple classifiers and their applications to handwriting recognition , 1992, IEEE Trans. Syst. Man Cybern..
[18] Robert Tibshirani,et al. An Introduction to the Bootstrap , 1994 .
[19] Ludmila I. Kuncheva,et al. Genetic Algorithm for Feature Selection for Parallel Classifiers , 1993, Inf. Process. Lett..
[20] Sargur N. Srihari,et al. Decision Combination in Multiple Classifier Systems , 1994, IEEE Trans. Pattern Anal. Mach. Intell..
[21] Anders Krogh,et al. Neural Network Ensembles, Cross Validation, and Active Learning , 1994, NIPS.
[22] Cullen Schaffer,et al. A Conservation Law for Generalization Performance , 1994, ICML.
[23] W. Press,et al. Numerical Recipes in Fortran: The Art of Scientific Computing; Numerical Recipes in C: The Art of Scientific Computing , 1994 .
[24] M. Kamel,et al. Voting schemes for cooperative neural network classifiers , 1995, Proceedings of ICNN'95 - International Conference on Neural Networks.
[25] Christopher M. Bishop,et al. Neural networks for pattern recognition , 1995 .
[26] David P. Williamson,et al. Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming , 1995, JACM.
[27] Thomas G. Dietterich,et al. Solving Multiclass Learning Problems via Error-Correcting Output Codes , 1994, J. Artif. Intell. Res..
[28] Anders Krogh,et al. Learning with ensembles: How overfitting can be useful , 1995, NIPS.
[29] Naonori Ueda,et al. Generalization error of ensemble estimators , 1996, Proceedings of International Conference on Neural Networks (ICNN'96).
[30] Brian D. Ripley,et al. Pattern Recognition and Neural Networks , 1996 .
[31] Amanda J. C. Sharkey,et al. On Combining Artificial Neural Nets , 1996, Connect. Sci..
[32] Yoav Freund,et al. Experiments with a New Boosting Algorithm , 1996, ICML.
[33] J. Ross Quinlan,et al. Bagging, Boosting, and C4.5 , 1996, AAAI/IAAI, Vol. 1.
[34] Bruce E. Rosen,et al. Ensemble Learning Using Decorrelated Neural Networks , 1996, Connect. Sci..
[35] David Barber,et al. Gaussian Processes for Bayesian Classification via Hybrid Monte Carlo , 1996, NIPS.
[36] R. Tibshirani,et al. Combining Estimates in Regression and Classification , 1996 .
[37] Kagan Tumer,et al. Error Correlation and Error Reduction in Ensemble Classifiers , 1996, Connect. Sci..
[38] David H. Wolpert,et al. The Lack of A Priori Distinctions Between Learning Algorithms , 1996, Neural Computation.
[39] L. Breiman. OUT-OF-BAG ESTIMATION , 1996 .
[40] Kevin W. Bowyer,et al. Combination of Multiple Classifiers Using Local Accuracy Estimates , 1997, IEEE Trans. Pattern Anal. Mach. Intell..
[41] Thomas G. Dietterich. Machine-Learning Research , 1997, AI Mag..
[42] Yoav Freund,et al. A decision-theoretic generalization of on-line learning and an application to boosting , 1997, EuroCOLT.
[43] Yoav Freund,et al. Boosting the margin: A new explanation for the effectiveness of voting methods , 1997, ICML.
[44] Noel E. Sharkey,et al. Combining diverse neural nets , 1997, The Knowledge Engineering Review.
[45] Harris Drucker,et al. Improving Regressors using Boosting Techniques , 1997, ICML.
[46] Radford M. Neal. Monte Carlo Implementation of Gaussian Process Models for Bayesian Regression and Classification , 1997, physics/9701026.
[47] Ching Y. Suen,et al. Application of majority voting to pattern recognition: an analysis of its behavior and performance , 1997, IEEE Trans. Syst. Man Cybern. Part A.
[48] Thomas G. Dietterich,et al. Pruning Adaptive Boosting , 1997, ICML.
[49] Robert E. Schapire,et al. Using output codes to boost multiclass learning problems , 1997, ICML.
[50] E. George,et al. APPROACHES FOR BAYESIAN VARIABLE SELECTION , 1997 .
[51] Yali Amit,et al. Shape Quantization and Recognition with Randomized Trees , 1997, Neural Computation.
[52] Yoshua Bengio,et al. Gradient-based learning applied to document recognition , 1998, Proc. IEEE.
[53] Jiri Matas,et al. On Combining Classifiers , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[54] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[55] David Barber,et al. Bayesian Classification With Gaussian Processes , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[56] Amanda J. C. Sharkey,et al. Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems , 1999 .
[57] Alexey Tsymbal,et al. A Dynamic Integration Algorithm for an Ensemble of Classifiers , 1999, ISMIS.
[58] J. Mesirov,et al. Molecular classification of cancer: class discovery and class prediction by gene expression monitoring. , 1999, Science.
[59] D. Opitz,et al. Popular Ensemble Methods: An Empirical Study , 1999, J. Artif. Intell. Res..
[60] U. Alon,et al. Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. , 1999, Proceedings of the National Academy of Sciences of the United States of America.
[61] Xin Yao,et al. Ensemble learning via negative correlation , 1999, Neural Networks.
[62] Andrew R. Webb,et al. Statistical Pattern Recognition , 1999 .
[63] Nathan Intrator,et al. Boosting Regression Estimators , 1999, Neural Computation.
[64] Ian H. Witten,et al. Data mining: practical machine learning tools and techniques, 3rd Edition , 1999 .
[65] Hagai Attias,et al. A Variational Bayesian Framework for Graphical Models , 1999 .
[66] Leo Breiman,et al. Prediction Games and Arcing Algorithms , 1999, Neural Computation.
[67] Nikunj C. Oza,et al. Decimated input ensembles for improved generalization , 1999, IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339).
[68] Roger E Bumgarner,et al. Comparative hybridization of an array of 21,500 ovarian cDNAs for the discovery of genes overexpressed in ovarian carcinomas. , 1999, Gene.
[69] Thomas Richardson,et al. Boosting methodology for regression problems , 1999, AISTATS.
[70] Christino Tamon,et al. On the Boosting Pruning Problem , 2000, ECML.
[71] Alexey Tsymbal,et al. Bagging and Boosting with Dynamic Integration of Classifiers , 2000, PKDD.
[72] Thomas G. Dietterich. Ensemble Methods in Machine Learning , 2000, Multiple Classifier Systems.
[73] Fabio Roli,et al. Design of effective multiple classifier systems by clustering of classifiers , 2000, Proceedings 15th International Conference on Pattern Recognition. ICPR-2000.
[74] David J. C. MacKay,et al. Variational Gaussian process classifiers , 2000, IEEE Trans. Neural Networks Learn. Syst..
[75] J. Friedman. Special Invited Paper-Additive logistic regression: A statistical view of boosting , 2000 .
[76] Pedro M. Domingos. A Unified Bias-Variance Decomposition and its Applications , 2000, ICML.
[77] Vladimir N. Vapnik,et al. The Nature of Statistical Learning Theory , 2000, Statistics for Engineering and Information Science.
[78] Ole Winther,et al. Gaussian Processes for Classification: Mean-Field Algorithms , 2000, Neural Computation.
[79] Robert P. W. Duin,et al. Experiments with Classifier Combining Rules , 2000, Multiple Classifier Systems.
[80] Fabio Roli,et al. Dynamic classifier selection based on multiple classifier behaviour , 2001, Pattern Recognit..
[81] Tom Minka,et al. Expectation Propagation for approximate Bayesian inference , 2001, UAI.
[82] J. Friedman. Greedy function approximation: A gradient boosting machine. , 2001 .
[83] Trevor Hastie,et al. The Elements of Statistical Learning , 2001 .
[84] James C. Bezdek,et al. Decision templates for multiple classifier fusion: an experimental comparison , 2001, Pattern Recognit..
[85] Colin Campbell,et al. Bayes Point Machines , 2001, J. Mach. Learn. Res..
[86] Edward R. Dougherty,et al. Small Sample Issues for Microarray-Based Classification , 2001, Comparative and functional genomics.
[87] Ole Winther,et al. TAP Gibbs Free Energy, Belief Propagation and Sparsity , 2001, NIPS.
[88] Olivier Debeir,et al. Limiting the Number of Trees in Random Forests , 2001, Multiple Classifier Systems.
[89] Zoran Obradovic,et al. Effective pruning of neural network classifier ensembles , 2001, IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222).
[90] Shlomo Argamon,et al. Arbitrating Among Competing Classifiers Using Learned Referees , 2001, Knowledge and Information Systems.
[91] Tom Minka,et al. A family of algorithms for approximate Bayesian inference , 2001 .
[92] D K Smith,et al. Numerical Optimization , 2001, J. Oper. Res. Soc..
[93] E. Dougherty,et al. Gene-expression profiles in hereditary breast cancer. , 2001, The New England journal of medicine.
[94] R. Spang,et al. Predicting the clinical status of human breast cancer by using gene expression profiles , 2001, Proceedings of the National Academy of Sciences of the United States of America.
[95] Salvatore J. Stolfo,et al. Cost Complexity-Based Pruning of Ensemble Classifiers , 2001, Knowledge and Information Systems.
[96] Fabio Roli,et al. An approach to the automatic design of multiple classifier systems , 2001, Pattern Recognit. Lett..
[97] Yudong D. He,et al. Gene expression profiling predicts clinical outcome of breast cancer , 2002, Nature.
[98] Yi Li,et al. Bayesian automatic relevance determination algorithms for classifying gene expression data. , 2002, Bioinformatics.
[99] Jiawei Zhang,et al. An improved rounding method and semidefinite programming relaxation for graph partition , 2002, Math. Program..
[100] David J. Spiegelhalter,et al. VIBES: A Variational Inference Engine for Bayesian Networks , 2002, NIPS.
[101] Gunnar Rätsch,et al. An Introduction to Boosting and Leveraging , 2002, Machine Learning Summer School.
[102] Philip S. Yu,et al. Pruning and dynamic scheduling of cost-sensitive ensembles , 2002, AAAI/IAAI.
[103] Peter Bühlmann,et al. Supervised clustering of genes , 2002, Genome Biology.
[104] Giorgio Valentini,et al. Ensembles of Learning Machines , 2002, WIRN.
[105] Lehel Csató,et al. Sparse On-Line Gaussian Processes , 2002, Neural Computation.
[106] Johannes Fürnkranz,et al. Round Robin Classification , 2002, J. Mach. Learn. Res..
[107] S. Dudoit,et al. Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data , 2002 .
[108] E. Lander,et al. Gene expression correlates of clinical prostate cancer behavior. , 2002, Cancer cell.
[109] Geoffrey J McLachlan,et al. Selection bias in gene extraction on the basis of microarray gene-expression data , 2002, Proceedings of the National Academy of Sciences of the United States of America.
[110] R. Tibshirani,et al. Diagnosis of multiple cancer types by shrunken centroids of gene expression , 2002, Proceedings of the National Academy of Sciences of the United States of America.
[111] Wei Tang,et al. Ensembling neural networks: Many could be better than all , 2002, Artif. Intell..
[112] Robert P. W. Duin,et al. Bagging, Boosting and the Random Subspace Method for Linear Classifiers , 2002, Pattern Analysis & Applications.
[113] Neil D. Lawrence,et al. Fast Sparse Gaussian Process Methods: The Informative Vector Machine , 2002, NIPS.
[114] Christopher M. Bishop,et al. Bayesian Hierarchical Mixtures of Experts , 2002, UAI.
[115] Hyun-Chul Kim,et al. Constructing support vector machine ensemble , 2003, Pattern Recognit..
[116] Kagan Tumer,et al. Input decimated ensembles , 2003, Pattern Analysis & Applications.
[117] Robert P. W. Duin,et al. Limits on the majority vote accuracy in classifier fusion , 2003, Pattern Analysis & Applications.
[118] Matthias W. Seeger,et al. Bayesian Gaussian process models : PAC-Bayesian generalisation error bounds and sparse approximations , 2003 .
[119] E. Lander,et al. A molecular signature of metastasis in primary solid tumors , 2003, Nature Genetics.
[120] Carla E. Brodley,et al. Random Projection for High Dimensional Data Clustering: A Cluster Ensemble Approach , 2003, ICML.
[121] Mykola Pechenizkiy,et al. Dynamic Integration of Classifiers in the Space of Principal Components , 2003, ADBIS.
[122] Marina Vannucci,et al. Gene selection: a Bayesian variable selection approach , 2003, Bioinform..
[123] Wei Tang,et al. Selective Ensemble of Decision Trees , 2003, RSFDGrC.
[124] Lorenza Saitta,et al. Monte Carlo theory as an explanation of bagging and boosting , 2003, IJCAI 2003.
[125] Tom Heskes,et al. Clustering ensembles of neural network models , 2003, Neural Networks.
[126] Philip S. Yu,et al. Mining concept-drifting data streams using ensemble classifiers , 2003, KDD '03.
[127] D. Edwards,et al. Statistical Analysis of Gene Expression Microarray Data , 2003 .
[128] P. Bühlmann. Bagging, subagging and bragging for improving some prediction algorithms , 2003 .
[129] Gunnar Rätsch,et al. Soft Margins for AdaBoost , 2001, Machine Learning.
[130] Leo Breiman,et al. Randomizing Outputs to Increase Prediction Accuracy , 2000, Machine Learning.
[131] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[132] Grigorios Tsoumakas,et al. Effective Voting of Heterogeneous Classifiers , 2004, ECML.
[133] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization , 2000, Machine Learning.
[134] Stephen T. C. Wong,et al. Cancer classification and prediction using logistic regression with Bayesian gene selection , 2004, J. Biomed. Informatics.
[135] Rich Caruana,et al. Ensemble selection from libraries of models , 2004, ICML.
[136] Bani K. Mallick,et al. Gene selection using a two-level hierarchical Bayesian model , 2004, Bioinform..
[137] Tom Bylander,et al. Estimating Generalization Error on Two-Class Datasets Using Out-of-Bag Estimates , 2002, Machine Learning.
[138] João Gama,et al. Cascade Generalization , 2000, Machine Learning.
[139] Geoffrey I. Webb,et al. MultiBoosting: A Technique for Combining Boosting and Wagging , 2000, Machine Learning.
[140] Lawrence Carin,et al. Joint Classifier and Feature Optimization for Comprehensive Cancer Diagnosis Using Gene Expression Data , 2004, J. Comput. Biol..
[141] Saso Dzeroski,et al. Combining Classifiers with Meta Decision Trees , 2003, Machine Learning.
[142] Marcel Dettling,et al. BagBoosting for tumor classification with gene expression data , 2004, Bioinform..
[143] David J. C. MacKay,et al. Information Theory, Inference, and Learning Algorithms , 2004, IEEE Transactions on Information Theory.
[144] Jason Weston,et al. Gene Selection for Cancer Classification using Support Vector Machines , 2002, Machine Learning.
[145] Eric Bauer,et al. An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants , 1999, Machine Learning.
[146] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[147] Cigdem Demir,et al. Cost-conscious classifier ensembles , 2005, Pattern Recognit. Lett..
[148] Fabio Roli,et al. A theoretical and experimental analysis of linear combiners for multiple classifier systems , 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[149] Gunnar Rätsch,et al. Efficient Margin Maximizing with Boosting , 2005, J. Mach. Learn. Res..
[150] Bogdan Gabrys,et al. Classifier selection for majority voting , 2005, Inf. Fusion.
[151] Timothy S Gardner,et al. Reverse-engineering transcription control networks. , 2005, Physics of life reviews.
[152] Xin Yao,et al. Diversity creation methods: a survey and categorisation , 2004, Inf. Fusion.
[153] Carl E. Rasmussen,et al. Assessing Approximate Inference for Binary Gaussian Process Classification , 2005, J. Mach. Learn. Res..
[154] Robert P. W. Duin,et al. Combining Feature Subsets in Feature Selection , 2005, Multiple Classifier Systems.
[155] L. Breiman. Stacked Regressions , 1996, Machine Learning.
[156] Subhash C. Bagui,et al. Combining Pattern Classifiers: Methods and Algorithms , 2005, Technometrics.
[157] P. Hall,et al. Properties of bagged nearest neighbour classifiers , 2005 .
[158] Ramón Díaz-Uriarte,et al. Gene selection and classification of microarray data using random forest , 2006, BMC Bioinformatics.
[159] Peter Tiño,et al. Managing Diversity in Regression Ensembles , 2005, J. Mach. Learn. Res..
[160] Gonzalo Martínez-Muñoz,et al. Switching class labels to generate classification ensembles , 2005, Pattern Recognit..
[161] Lawrence O. Hall,et al. Ensemble diversity measures and their application to thinning , 2004, Inf. Fusion.
[162] Jae Won Lee,et al. An extensive comparison of recent classification tools applied to microarray data , 2004, Comput. Stat. Data Anal..
[163] Igor V. Tetko,et al. Gene selection from microarray data for cancer classification - a machine learning approach , 2005, Comput. Biol. Chem..
[164] Ole Winther,et al. Expectation Consistent Approximate Inference , 2005, J. Mach. Learn. Res..
[165] Lefteris Angelis,et al. Selective fusion of heterogeneous classifiers , 2005, Intell. Data Anal..
[166] Josef Kittler,et al. Combining classifiers: A theoretical framework , 1998, Pattern Analysis and Applications.
[167] Anand M. Narasimhamurthy. Theoretical bounds of majority voting performance for a binary classification problem , 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[168] Pierre Geurts,et al. Extremely randomized trees , 2006, Machine Learning.
[169] Huanhuan Chen,et al. A Probabilistic Ensemble Pruning Algorithm , 2006, Sixth IEEE International Conference on Data Mining - Workshops (ICDMW'06).
[170] Aníbal R. Figueiras-Vidal,et al. Boosting by weighting critical and erroneous samples , 2006, Neurocomputing.
[171] William Nick Street,et al. Ensemble Pruning Via Semi-definite Programming , 2006, J. Mach. Learn. Res..
[172] Gavin C. Cawley,et al. Gene Selection in Cancer Classification using Sparse Logistic Regression with Bayesian Regularisation , 2006 .
[173] Juan José Rodríguez Diez,et al. Rotation Forest: A New Classifier Ensemble Method , 2006, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[174] Janez Demsar,et al. Statistical Comparisons of Classifiers over Multiple Data Sets , 2006, J. Mach. Learn. Res..
[175] Christopher M. Bishop,et al. Pattern Recognition and Machine Learning (Information Science and Statistics) , 2006 .
[176] Daniel Hernández-Lobato,et al. Pruning Adaptive Boosting Ensembles by Means of a Genetic Algorithm , 2006, IDEAL.
[177] Gonzalo Martínez-Muñoz,et al. Pruning in ordered bagging ensembles , 2006, ICML.
[178] Grigorios Tsoumakas,et al. Ensemble Pruning Using Reinforcement Learning , 2006, SETN.
[179] Daniel Hernández-Lobato,et al. Pruning in Ordered Regression Bagging Ensembles , 2006, The 2006 IEEE International Joint Conference on Neural Network Proceedings.
[180] Michael I. Jordan,et al. Variational inference for Dirichlet process mixtures , 2006 .
[181] Gonzalo Martínez-Muñoz,et al. Using boosting to prune bagging ensembles , 2007, Pattern Recognit. Lett..
[182] Sotiris B. Kotsiantis,et al. Machine learning: a review of classification and combining techniques , 2006, Artificial Intelligence Review.
[183] Anne M. P. Canuto,et al. Investigating the influence of the choice of the ensemble members in accuracy and diversity of selection-based and fusion-based methods for ensembles , 2007, Pattern Recognit. Lett..
[184] Patrick P. K. Chan,et al. Neural network ensemble pruning using sensitivity measure in web applications , 2007, 2007 IEEE International Conference on Systems, Man and Cybernetics.
[185] Ludmila I. Kuncheva,et al. A stability index for feature selection , 2007, Artificial Intelligence and Applications.
[186] Florian Steinke,et al. Bayesian Inference and Optimal Design in the Sparse Linear Model , 2007, AISTATS.
[187] Daniel Hernández-Lobato,et al. Selection of Decision Stumps in Bagging Ensembles , 2007, ICANN.
[188] Jean-Philippe Thiran,et al. Information Theoretic Combination of Classifiers with Application to AdaBoost , 2007, MCS.
[189] Lawrence O. Hall,et al. A Comparison of Decision Tree Ensemble Creation Techniques , 2007 .
[190] Gene H. Golub,et al. Methods for modifying matrix factorizations , 1972, Milestones in Matrix Computation.
[191] Daniel Hernández-Lobato,et al. Out of Bootstrap Estimation of Generalization Error Curves in Bagging Ensembles , 2007, IDEAL.
[192] Hongzhe Li,et al. In Response to Comment on "Network-constrained regularization and variable selection for analysis of genomic data" , 2008, Bioinform..
[193] Thomas G. Dietterich,et al. Machine Learning Bias, Statistical Bias, and Statistical Variance of Decision Tree Algorithms , 2008 .
[194] Grigorios Tsoumakas,et al. Greedy regression ensemble selection: Theory and an application to water quality prediction , 2008, Inf. Sci..
[195] Fabio Roli,et al. A Theoretical Analysis of Bagging as a Linear Combination of Classifiers , 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[196] Aníbal R. Figueiras-Vidal,et al. A Dynamically Adjusted Mixed Emphasis Method for Building Boosting Ensembles , 2008, IEEE Transactions on Neural Networks.
[197] Daniel Hernández-Lobato,et al. Bayes Machines for binary classification , 2008, Pattern Recognit. Lett..
[198] Daniel Hernández-Lobato. Sparse Bayes Machines for Binary Classification , 2008, ICANN.
[199] Daniel Hernández-Lobato,et al. An Analysis of Ensemble Pruning Techniques Based on Ordered Aggregation , 2009, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[200] Laurent Heutte,et al. Influence of Hyperparameters on Random Forest Accuracy , 2009, MCS.
[201] Yehuda Koren,et al. The BellKor Solution to the Netflix Grand Prize , 2009 .
[202] Wei Pan,et al. Network-based support vector machine for classification of microarray samples , 2009, BMC Bioinformatics.
[203] Yuan Qi,et al. Virtual Vector Machine for Bayesian Online Classification , 2009, UAI.
[204] Antanas Verikas,et al. A feature selection technique for generation of classification committees and its application to categorization of laryngeal images , 2009, Pattern Recognit..
[205] Yoav Freund,et al. A more robust boosting algorithm , 2009, 0905.2138.
[206] Daniel Hernández-Lobato,et al. Expectation Propagation for microarray data classification , 2010, Pattern Recognit. Lett..
[207] William N. Venables,et al. Modern Applied Statistics with S , 2010 .
[208] Chih-Jen Lin,et al. LIBSVM: A library for support vector machines , 2011, TIST.