Collaborative hyperparameter tuning
Rémi Bardenet | Mátyás Brendel | Balázs Kégl | Michèle Sebag