[1] Risto Miikkulainen et al. Evolving Neural Networks through Augmenting Topologies, 2002, Evolutionary Computation.
[2] Shifeng Zhang et al. DARTS+: Improved Differentiable Architecture Search with Early Stopping, 2019, ArXiv.
[3] Josif Grabocka et al. NASLib: A Modular and Flexible Neural Architecture Search Library, 2020.
[4] Maria-Florina Balcan et al. Geometry-Aware Gradient Algorithms for Neural Architecture Search, 2020, ICLR.
[5] Ameet Talwalkar et al. Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization, 2016, J. Mach. Learn. Res.
[6] Carl E. Rasmussen et al. A Unifying View of Sparse Approximate Gaussian Process Regression, 2005, J. Mach. Learn. Res.
[7] Kirthevasan Kandasamy et al. Neural Architecture Search with Bayesian Optimisation and Optimal Transport, 2018, NeurIPS.
[8] Carl E. Rasmussen et al. Understanding Probabilistic Sparse Gaussian Process Approximations, 2016, NIPS.
[9] Fabio Maria Carlucci et al. NAS evaluation is frustratingly hard, 2020, ICLR.
[10] Quoc V. Le et al. Neural Architecture Search with Reinforcement Learning, 2016, ICLR.
[11] Ameet Talwalkar et al. Random Search and Reproducibility for Neural Architecture Search, 2019, UAI.
[12] Chen Zhang et al. Deeper Insights into Weight Sharing in Neural Architecture Search, 2020, ArXiv.
[13] Frank Hutter et al. A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets, 2017, ArXiv.
[14] Quoc V. Le et al. Efficient Neural Architecture Search via Parameter Sharing, 2018, ICML.
[15] Lukás Burget et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[16] Tianqi Chen et al. XGBoost: A Scalable Tree Boosting System, 2016, KDD.
[17] Jiaxu Cui et al. Deep Neural Architecture Search with Deep Graph Bayesian Optimization, 2019, IEEE/WIC/ACM International Conference on Web Intelligence (WI).
[18] Aaron Klein et al. Bayesian Optimization with Robust Bayesian Neural Networks, 2016, NIPS.
[19] Amos Storkey et al. Neural Architecture Search without Training, 2021, ICML.
[20] Willie Neiswanger et al. BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search, 2021, AAAI.
[21] Andrew Y. Ng et al. NGBoost: Natural Gradient Boosting for Probabilistic Prediction, 2019, ICML.
[22] Yi Yang et al. NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search, 2020, ICLR.
[23] Xiaowen Dong et al. Neural Architecture Search using Bayesian Optimisation with Weisfeiler-Lehman Kernel, 2020, ArXiv.
[24] Yu Wang et al. A Surgery of the Neural Architecture Evaluators, 2020.
[25] Yi Yang et al. Searching for a Robust Neural Architecture in Four GPU Hours, 2019, CVPR.
[26] Christopher M. Bishop. Pattern Recognition and Machine Learning, 2006, Springer.
[27] Ameet Talwalkar et al. A System for Massively Parallel Hyperparameter Tuning, 2020, MLSys.
[28] Marius Lindauer et al. Best Practices for Scientific Research on Neural Architecture Search, 2019, ArXiv.
[29] Prabhat et al. Scalable Bayesian Optimization Using Deep Neural Networks, 2015, ICML.
[30] Yanan Sun et al. A Novel Training Protocol for Performance Predictors of Evolutionary Neural Architecture Search Algorithms, 2020, ArXiv.
[31] Mathieu Salzmann et al. How to Train Your Super-Net: An Analysis of Training Heuristics in Weight-Sharing NAS, 2020, ArXiv.
[32] Gaël Varoquaux et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[33] Chen Liang et al. Carbon Emissions and Large Neural Network Training, 2021, ArXiv.
[34] Yiming Yang et al. DARTS: Differentiable Architecture Search, 2018, ICLR.
[35] James T. Kwok et al. Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS, 2020, NeurIPS.
[36] Ian R. Lane et al. Speeding up Hyper-parameter Optimization by Extrapolation of Learning Curves Using Previous Builds, 2017, ECML/PKDD.
[37] Margret Keuper et al. NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search, 2020, ArXiv.
[38] Tie-Yan Liu et al. Neural Architecture Optimization, 2018, NeurIPS.
[39] Noah D. Goodman et al. Pyro: Deep Universal Probabilistic Programming, 2018, J. Mach. Learn. Res.
[40] Thomas Brox et al. Understanding and Robustifying Differentiable Architecture Search, 2020, ICLR.
[41] Enhong Chen et al. Neural Architecture Search with GBDT, 2020, ArXiv.
[42] Aaron Klein et al. Learning Curve Prediction with Bayesian Neural Networks, 2016, ICLR.
[43] Chen Wei et al. NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search, 2020, ArXiv.
[44] Vijay Vasudevan et al. Learning Transferable Architectures for Scalable Image Recognition, 2018, CVPR.
[45] Philip H. S. Torr et al. SNIP: Single-shot Network Pruning based on Connection Sensitivity, 2018, ICLR.
[46] Alex Krizhevsky et al. Learning Multiple Layers of Features from Tiny Images, 2009.
[47] Michalis K. Titsias et al. Variational Learning of Inducing Variables in Sparse Gaussian Processes, 2009, AISTATS.
[48] Enhong Chen et al. Semi-Supervised Neural Architecture Search, 2020, NeurIPS.
[49] Alok Aggarwal et al. Regularized Evolution for Image Classifier Architecture Search, 2018, AAAI.
[50] Hanxiao Liu et al. Neural Predictor for Neural Architecture Search, 2019, ECCV.
[51] Hiroaki Kitano. Designing Neural Networks Using Genetic Algorithms with Graph Generation System, 1990, Complex Systems.
[52] Yu Wang et al. A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS, 2020, ECCV.
[53] Martin Wistuba et al. A Survey on Neural Architecture Search, 2019, ArXiv.
[54] Xuesen Zhang et al. EcoNAS: Finding Proxies for Economical Neural Architecture Search, 2020, CVPR.
[55] Frank Hutter et al. Neural Architecture Search: A Survey, 2018, J. Mach. Learn. Res.
[56] Lucas Theis et al. Faster gaze prediction with dense networks and Fisher pruning, 2018, ArXiv.
[57] Martin Jaggi et al. Evaluating the Search Phase of Neural Architecture Search, 2019, ICLR.
[58] Roger B. Grosse et al. Picking Winning Tickets Before Training by Preserving Gradient Flow, 2020, ICLR.
[59] Willie Neiswanger et al. A Study on Encodings for Neural Architecture Search, 2020, NeurIPS.
[60] Tie-Yan Liu et al. LightGBM: A Highly Efficient Gradient Boosting Decision Tree, 2017, NIPS.
[61] Evgeny Burnaev et al. NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing, 2020, IEEE Access.
[62] Quoc V. Le et al. Understanding and Simplifying One-Shot Architecture Search, 2018, ICML.
[63] Michael A. Osborne et al. Raiders of the Lost Architecture: Kernels for Bayesian Optimization in Conditional Parameter Spaces, 2014, ArXiv:1409.4011.
[64] Ramesh Raskar et al. Accelerating Neural Architecture Search using Performance Prediction, 2017, ICLR.
[65] Aaron Klein et al. BOHB: Robust and Efficient Hyperparameter Optimization at Scale, 2018, ICML.
[66] Nicholas D. Lane et al. Zero-Cost Proxies for Lightweight NAS, 2021, ICLR.
[67] Song Han et al. ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware, 2018, ICLR.
[68] Frank Hutter et al. Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves, 2015, IJCAI.
[69] Carl E. Rasmussen et al. Gaussian Processes for Machine Learning, 2005, Adaptive Computation and Machine Learning.
[70] Mark van der Wilk et al. Revisiting the Train Loss: an Efficient Performance Estimator for Neural Architecture Search, 2020, ArXiv.
[71] Daniel L. K. Yamins et al. Pruning neural networks without any data by iteratively conserving synaptic flow, 2020, NeurIPS.
[72] Frank Hutter et al. NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search, 2020, ICLR.
[73] Andy Liaw et al. Classification and Regression by randomForest, 2007.
[74] Andrey Khorlin et al. Evolutionary-Neural Hybrid Agents for Architecture Search, 2018, ArXiv.
[75] Yiyang Zhao et al. AlphaX: eXploring Neural Architectures with Deep Neural Networks and Monte Carlo Tree Search, 2019, ArXiv.