Marco Loog | Jan van Gemert | Ziqi Wang | Kanav Anand
[1] W. Pirie. Spearman Rank Correlation Coefficient, 2006.
[2] Quoc V. Le, et al. Efficient Neural Architecture Search via Parameter Sharing, 2018, ICML.
[3] Robert P. Sheridan, et al. Deep Neural Nets as a Method for Quantitative Structure-Activity Relationships, 2015, J. Chem. Inf. Model.
[4] Forrest N. Iandola, et al. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <1MB model size, 2016, arXiv.
[5] Mario Lucic, et al. Are GANs Created Equal? A Large-Scale Study, 2017, NeurIPS.
[6] Guigang Zhang, et al. Deep Learning, 2016, Int. J. Semantic Comput.
[7] Lars Kotthoff, et al. Auto-WEKA 2.0: Automatic model selection and hyperparameter optimization in WEKA, 2017, J. Mach. Learn. Res.
[8] Tara N. Sainath, et al. Improvements to Deep Convolutional Neural Networks for LVCSR, 2013, IEEE Workshop on Automatic Speech Recognition and Understanding.
[9] Camille Couprie, et al. Learning Hierarchical Features for Scene Labeling, 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[10] Kevin Leyton-Brown, et al. An Efficient Approach for Assessing Hyperparameter Importance, 2014, ICML.
[11] I. Rish. An empirical study of the naive Bayes classifier, 2001.
[12] H. H. Ku, et al. Contributions to Probability and Statistics: Essays in Honor of Harold Hotelling, 1961.
[13] Geoffrey E. Hinton, et al. ImageNet classification with deep convolutional neural networks, 2012, Commun. ACM.
[14] Jan N. van Rijn, et al. Hyperparameter Importance Across Datasets, 2017, KDD.
[15] Jasper Snoek, et al. Practical Bayesian Optimization of Machine Learning Algorithms, 2012, NIPS.
[16] R. Fisher. The Use of Multiple Measurements in Taxonomic Problems, 1936.
[17] Heike Trautmann, et al. Automated Algorithm Selection: Survey and Perspectives, 2018, Evolutionary Computation.
[18] Iryna Gurevych, et al. Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks, 2017, arXiv.
[19] Frank Hutter, et al. Neural Architecture Search: A Survey, 2018, J. Mach. Learn. Res.
[20] David D. Cox, et al. Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures, 2013, ICML.
[21] Nando de Freitas, et al. Taking the Human Out of the Loop: A Review of Bayesian Optimization, 2016, Proceedings of the IEEE.
[22] Tara N. Sainath, et al. Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups, 2012, IEEE Signal Processing Magazine.
[23] Yadi Zhou, et al. Exploring Tunable Hyperparameters for Deep Neural Networks with Industrial ADME Data Sets, 2018, J. Chem. Inf. Model.
[24] Chris Dyer, et al. On the State of the Art of Evaluation in Neural Language Models, 2017, ICLR.
[25] Leslie N. Smith, et al. A disciplined approach to neural network hyper-parameters: Part 1 - learning rate, batch size, momentum, and weight decay, 2018, arXiv.
[26] Alexios Koutsoukas, et al. Deep-learning: investigating deep neural networks hyper-parameters and comparison of performance to shallow methods for modeling bioactivity data, 2017, Journal of Cheminformatics.
[27] Ron Kohavi, et al. Automatic Parameter Selection by Minimizing Estimated Error, 1995, ICML.
[28] Yiming Yang, et al. DARTS: Differentiable Architecture Search, 2018, ICLR.
[29] Li Fei-Fei, et al. ImageNet: A large-scale hierarchical image database, 2009, CVPR.
[30] Quoc V. Le, et al. Neural Architecture Search with Reinforcement Learning, 2016, ICLR.
[31] Frank Hutter, et al. Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves, 2015, IJCAI.
[32] Corinna Cortes, et al. Support-Vector Networks, 1995, Machine Learning.
[33] Peter E. Hart, et al. Nearest neighbor pattern classification, 1967, IEEE Trans. Inf. Theory.
[34] Jason Weston, et al. Question Answering with Subgraph Embeddings, 2014, EMNLP.