EmProx: Neural Network Performance Estimation For Neural Architecture Search

Common Neural Architecture Search (NAS) methods generate large numbers of candidate architectures that must be trained in order to assess their performance and find an optimal architecture. To reduce search time, performance estimation strategies are used in place of full training. The effectiveness of such strategies varies in terms of accuracy as well as fit and query time. This study proposes a new method, EmProx Score (Embedding Proximity Score). Similar to Neural Architecture Optimization (NAO), this method maps candidate architectures to a continuous embedding space using an encoder-decoder framework. The performance of a candidate is then estimated using weighted kNN over the embedding vectors of architectures whose performance is known. Performance estimates of this method are comparable in accuracy to the MLP performance predictor used in NAO, while being nearly nine times faster to train than NAO. Benchmarking against other performance estimation strategies currently in use shows similar or better accuracy, while being five to eighty times faster.
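
The estimation step described above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes candidate architectures have already been mapped to embedding vectors by the encoder, and it estimates a candidate's performance as an inverse-distance-weighted average of the known accuracies of its k nearest neighbours in the embedding space. All names (emprox_score, known_embeddings, known_accuracies) are hypothetical.

```python
import numpy as np

def emprox_score(candidate, known_embeddings, known_accuracies, k=10, eps=1e-8):
    """Distance-weighted kNN estimate of a candidate architecture's performance.

    candidate:        (d,)   embedding vector of the architecture to score
    known_embeddings: (n, d) embeddings of architectures with known accuracy
    known_accuracies: (n,)   measured accuracies of those architectures
    """
    # Euclidean distances from the candidate to all evaluated architectures.
    dists = np.linalg.norm(known_embeddings - candidate, axis=1)

    # Indices of the k closest neighbours in embedding space.
    nn = np.argsort(dists)[:k]

    # Inverse-distance weights: closer neighbours contribute more.
    weights = 1.0 / (dists[nn] + eps)

    # Weighted average of the neighbours' known accuracies.
    return float(np.sum(weights * known_accuracies[nn]) / np.sum(weights))

# Toy usage with random 32-dimensional embeddings standing in for encoder output.
rng = np.random.default_rng(0)
known = rng.normal(size=(100, 32))
accs = rng.uniform(0.85, 0.95, size=100)
print(emprox_score(rng.normal(size=32), known, accs, k=10))
```

Because scoring a candidate only requires distance computations against already-evaluated architectures, queries avoid the repeated model fitting that a learned predictor needs, which is consistent with the speedups reported in the abstract.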

[1] F. Hutter et al. How Powerful are Performance Predictors in Neural Architecture Search?, 2021, NeurIPS.

[2] Josif Grabocka et al. NASLib: A Modular and Flexible Neural Architecture Search Library, 2020.

[3] Margret Keuper et al. NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search, 2020, ArXiv.

[4] Mark van der Wilk et al. Speedy Performance Estimation for Neural Architecture Search, 2020, NeurIPS.

[5] Elliot J. Crowley et al. Neural Architecture Search without Training, 2020, ICML.

[6] Enhong Chen et al. Semi-Supervised Neural Architecture Search, 2020, NeurIPS.

[7] Yi Yang et al. NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search, 2020, ICLR.

[8] Hanxiao Liu et al. Neural Predictor for Neural Architecture Search, 2019, ECCV.

[9] Colin White et al. BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search, 2019, AAAI.

[10] Aaron Klein et al. NAS-Bench-101: Towards Reproducible Neural Architecture Search, 2019, ICML.

[11] Tie-Yan Liu et al. Neural Architecture Optimization, 2018, NeurIPS.

[12] Ramesh Raskar et al. Practical Neural Network Performance Prediction for Early Stopping, 2017, ArXiv.

[13] Quoc V. Le et al. Large-Scale Evolution of Image Classifiers, 2017, ICML.

[14] Ramesh Raskar et al. Designing Neural Network Architectures using Reinforcement Learning, 2016, ICLR.

[15] Tianqi Chen et al. XGBoost: A Scalable Tree Boosting System, 2016, KDD.

[16] Mikhail Khodak et al. NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search, 2021, ArXiv.