Automated Design of Deep Neural Networks
[1] Ramesh Raskar,et al. Designing Neural Network Architectures using Reinforcement Learning , 2016, ICLR.
[2] El-Ghazali Talbi,et al. Bayesian optimization of variable-size design space problems , 2020, Optimization and Engineering.
[3] Dhabaleswar K. Panda,et al. S-Caffe: Co-designing MPI Runtimes and Caffe for Scalable Deep Learning on Modern GPU Clusters , 2017, PPoPP.
[4] Thomas Paine,et al. GPU Asynchronous Stochastic Gradient Descent to Speed Up Neural Network Training , 2013, ICLR.
[5] Marc'Aurelio Ranzato,et al. Large Scale Distributed Deep Networks , 2012, NIPS.
[6] Dumitru Erhan,et al. Going deeper with convolutions , 2014, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[7] Frank Hutter,et al. Efficient Multi-Objective Neural Architecture Search via Lamarckian Evolution , 2018, ICLR.
[8] Nicholas Rhinehart,et al. N2N Learning: Network to Network Compression via Policy Gradient Reinforcement Learning , 2017, ICLR.
[9] Yoshua Bengio,et al. Generative Adversarial Nets , 2014, NIPS.
[10] Roger A. Pearce,et al. Large-Scale Deep Learning on the YFCC100M Dataset , 2015, ArXiv.
[11] Tie-Yan Liu,et al. Neural Architecture Optimization , 2018, NeurIPS.
[12] Alok Aggarwal,et al. Regularized Evolution for Image Classifier Architecture Search , 2018, AAAI.
[13] Quoc V. Le,et al. EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks , 2019, ICML.
[14] Nuno Lourenço,et al. Coevolution of Generative Adversarial Networks , 2019, EvoApplications.
[15] Cory Stephenson,et al. A Comparison of Loss Weighting Strategies for Multi task Learning in Deep Neural Networks , 2019, IEEE Access.
[16] Geoffrey J. Gordon,et al. DeepArchitect: Automatically Designing and Training Deep Architectures , 2017, ArXiv.
[17] Hisao Ishibuchi,et al. Interactive Multiobjective Optimization: A Review of the State-of-the-Art , 2018, IEEE Access.
[18] Julien Cornebise,et al. Weight Uncertainty in Neural Network , 2015, ICML.
[19] Frank Hutter,et al. Neural Architecture Search: A Survey , 2018, J. Mach. Learn. Res..
[20] Quoc V. Le,et al. GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism , 2018, ArXiv.
[21] Bin Wang,et al. Evolving deep neural networks by multi-objective particle swarm optimization for image classification , 2019, GECCO.
[22] Nuno Lourenço,et al. Evolving the Topology of Large Scale Deep Neural Networks , 2018, EuroGP.
[23] Tansel Dökeroglu,et al. Evolutionary parallel extreme learning machines for the data classification problem , 2019, Comput. Ind. Eng..
[24] Yoshua Bengio,et al. Random Search for Hyper-Parameter Optimization , 2012, J. Mach. Learn. Res..
[25] Sergey Ioffe,et al. Rethinking the Inception Architecture for Computer Vision , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[26] Martin Jaggi,et al. Evaluating the Search Phase of Neural Architecture Search , 2019, ICLR.
[27] Mengjie Zhang,et al. An Experimental Study on Hyper-parameter Optimization for Stacked Auto-Encoders , 2018, 2018 IEEE Congress on Evolutionary Computation (CEC).
[28] Jakob Verbeek,et al. Convolutional Neural Fabrics , 2016, NIPS.
[29] Andrea E. Olsson. Particle Swarm Optimization: Theory, Techniques and Applications , 2010 .
[30] Li Fei-Fei,et al. Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation , 2019, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[31] Dong Wang,et al. Pruning deep neural networks by optimal brain damage , 2014, INTERSPEECH.
[32] Yee Whye Teh,et al. A Fast Learning Algorithm for Deep Belief Nets , 2006, Neural Computation.
[33] Frank Hutter,et al. Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves , 2015, IJCAI.
[34] Jonathan Gordon,et al. Probabilistic Neural Architecture Search , 2019, ArXiv.
[35] Lorenzo Torresani,et al. MaskConnect: Connectivity Learning by Gradient Descent , 2018, ECCV.
[36] Tianqi Chen,et al. Net2Net: Accelerating Learning via Knowledge Transfer , 2015, ICLR.
[37] Song Han,et al. Path-Level Network Transformation for Efficient Architecture Search , 2018, ICML.
[38] Xin Yao,et al. Evolutionary Generative Adversarial Networks , 2018, IEEE Transactions on Evolutionary Computation.
[39] Geoffrey E. Hinton,et al. ImageNet classification with deep convolutional neural networks , 2012, Commun. ACM.
[40] Terry Jones. Evolutionary Algorithms, Fitness Landscapes and Search , 1995 .
[41] A. J. Turner,et al. Evolving Artificial Neural Networks using Cartesian Genetic Programming , 2015 .
[42] Ashraf Darwish,et al. A survey of swarm and evolutionary computing approaches for deep learning , 2019, Artificial Intelligence Review.
[43] Xin Yao,et al. Ensemble Learning Using Multi-Objective Evolutionary Algorithms , 2006, J. Math. Model. Algorithms.
[44] Aaron Klein,et al. Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets , 2016, AISTATS.
[45] Li Fei-Fei,et al. Progressive Neural Architecture Search , 2017, ECCV.
[46] Jihoon Yang,et al. Constructive Neural-Network Learning Algorithms for Pattern Classification , 2000 .
[47] Qingfu Zhang,et al. MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition , 2007, IEEE Transactions on Evolutionary Computation.
[48] P. Stadler. Landscapes and their correlation functions , 1996 .
[49] Masoud Daneshtalab,et al. DeepMaker: A multi-objective optimization framework for deep neural networks in embedded systems , 2020, Microprocess. Microsystems.
[50] Wei Wu,et al. Practical Block-Wise Neural Network Architecture Generation , 2017, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[51] Kilian Q. Weinberger,et al. Densely Connected Convolutional Networks , 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[52] Vivienne Sze,et al. Designing Energy-Efficient Convolutional Neural Networks Using Energy-Aware Pruning , 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[53] Aaron Klein,et al. NAS-Bench-101: Towards Reproducible Neural Architecture Search , 2019, ICML.
[54] Vijay Vasudevan,et al. Learning Transferable Architectures for Scalable Image Recognition , 2017, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[55] Ye-Hoon Kim,et al. NEMO: Neuro-Evolution with Multiobjective Optimization of Deep Neural Network for Speed and Accuracy , 2017 .
[56] Lukasz Kaiser,et al. One Model To Learn Them All , 2017, ArXiv.
[57] Ludovic Denoyer,et al. Learning Time/Memory-Efficient Deep Architectures with Budgeted Super Networks , 2017, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[58] Matthew W. Hoffman,et al. Predictive Entropy Search for Efficient Global Optimization of Black-box Functions , 2014, NIPS.
[59] Julian Togelius,et al. Evolving Memory Cell Structures for Sequence Learning , 2009, ICANN.
[60] Roberto Cipolla,et al. Multi-task Learning Using Uncertainty to Weigh Losses for Scene Geometry and Semantics , 2017, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[61] Flavio Vella,et al. Multi-objective autotuning of MobileNets across the full software/hardware stack , 2018, ReQuEST@ASPLOS.
[62] Ameet Talwalkar,et al. Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization , 2016, J. Mach. Learn. Res..
[63] Masanori Suganuma,et al. A genetic programming approach to designing convolutional neural network architectures , 2017, GECCO.
[64] Milan Tuba,et al. Convolutional Neural Network Architecture Design by the Tree Growth Algorithm Framework , 2019, 2019 International Joint Conference on Neural Networks (IJCNN).
[65] Taimoor Akhtar,et al. Efficient Hyperparameter Optimization for Deep Learning Algorithms Using Deterministic RBF Surrogates , 2016, AAAI.
[66] Risto Miikkulainen,et al. Evolving Neural Networks through Augmenting Topologies , 2002, Evolutionary Computation.
[67] Min Sun,et al. DPP-Net: Device-aware Progressive Search for Pareto-optimal Neural Architectures , 2018, ECCV.
[68] Aaron Klein,et al. Hyperparameter Optimization , 2017, Encyclopedia of Machine Learning and Data Mining.
[69] David D. Cox,et al. Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures , 2013, ICML.
[70] Yoshua Bengio,et al. BinaryConnect: Training Deep Neural Networks with binary weights during propagations , 2015, NIPS.
[71] Kevin Leyton-Brown,et al. Auto-WEKA: combined selection and hyperparameter optimization of classification algorithms , 2012, KDD.
[72] D. Corkill. Blackboard Systems , 1991 .
[73] Matthias W. Seeger,et al. Bayesian Optimization with Tree-structured Dependencies , 2017, ICML.
[74] Tao Wang,et al. Deep learning with COTS HPC systems , 2013, ICML.
[75] Kirthevasan Kandasamy,et al. Neural Architecture Search with Bayesian Optimisation and Optimal Transport , 2018, NeurIPS.
[76] Guang Yang,et al. Neural networks designing neural networks: Multi-objective hyper-parameter optimization , 2016, 2016 IEEE/ACM International Conference on Computer-Aided Design (ICCAD).
[77] Zenghui Wang,et al. Hybrid Stochastic GA-Bayesian Search for Deep Convolutional Neural Network Model Selection , 2019, J. Univers. Comput. Sci..
[78] Yong Yu,et al. Efficient Architecture Search by Network Transformation , 2017, AAAI.
[79] Quoc V. Le,et al. Neural Architecture Search with Reinforcement Learning , 2016, ICLR.
[80] Bernd Bischl,et al. Resampling Methods for Meta-Model Validation with Recommendations for Evolutionary Computation , 2012, Evolutionary Computation.
[81] Enrique Alba,et al. Bayesian Neural Architecture Search using A Training-Free Performance Metric , 2020, Applied Soft Computing.
[82] Sparsh Mittal,et al. A survey of FPGA-based accelerators for convolutional neural networks , 2018, Neural Computing and Applications.
[83] Andrew Wilson,et al. Deep Learning Evolutionary Optimization for Regression of Rotorcraft Vibrational Spectra , 2018, 2018 IEEE/ACM Machine Learning in HPC Environments (MLHPC).
[84] Vladlen Koltun,et al. Multi-Task Learning as Multi-Objective Optimization , 2018, NeurIPS.
[85] Willie Neiswanger,et al. BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search , 2021, AAAI.
[86] Gary G. Yen,et al. Particle swarm optimization of deep neural networks architectures for image classification , 2019, Swarm Evol. Comput..
[87] Timo Aila,et al. Pruning Convolutional Neural Networks for Resource Efficient Transfer Learning , 2016, ArXiv.
[88] Loïc Brevault,et al. How to Deal with Mixed-Variable Optimization Problems: An Overview of Algorithms and Formulations , 2017 .
[89] Xiaogang Wang,et al. Structure Learning for Deep Neural Networks Based on Multiobjective Optimization , 2018, IEEE Transactions on Neural Networks and Learning Systems.
[90] Jakub Nalepa,et al. Memetic evolution of deep neural networks , 2018, GECCO.
[91] Yujie Li,et al. NAS-Unet: Neural Architecture Search for Medical Image Segmentation , 2019, IEEE Access.
[92] Junjie Yan,et al. Practical Network Blocks Design with Q-Learning , 2017, ArXiv.
[93] Dawn Xiaodong Song,et al. Differentiable Neural Network Architecture Search , 2018, ICLR.
[94] Andrew J. Davison,et al. End-To-End Multi-Task Learning With Attention , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[95] Fei Han,et al. Efficient network architecture search via multiobjective particle swarm optimization based on decomposition , 2019, Neural Networks.
[96] Gang Zhang,et al. GP-NAS: Gaussian Process Based Neural Architecture Search , 2020, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[97] George Papandreou,et al. Searching for Efficient Multi-Scale Architectures for Dense Image Prediction , 2018, NeurIPS.
[98] Jose Javier Gonzalez Ortiz,et al. What is the State of Neural Network Pruning? , 2020, MLSys.
[99] Chrisantha Fernando,et al. PathNet: Evolution Channels Gradient Descent in Super Neural Networks , 2017, ArXiv.
[100] Diana Marculescu,et al. NeuralPower: Predict and Deploy Energy-Efficient Convolutional Neural Networks , 2017, ArXiv.
[101] Greg Mori,et al. Constraint-Aware Deep Neural Network Compression , 2018, ECCV.
[102] Yoshua Bengio,et al. Algorithms for Hyper-Parameter Optimization , 2011, NIPS.
[103] Dajiang Zhou,et al. CNN-MERP: An FPGA-based memory-efficient reconfigurable processor for forward and backward propagation of convolutional neural networks , 2016, 2016 IEEE 34th International Conference on Computer Design (ICCD).
[104] Alex Krizhevsky,et al. One weird trick for parallelizing convolutional neural networks , 2014, ArXiv.
[105] Yoshua Bengio,et al. Understanding the difficulty of training deep feedforward neural networks , 2010, AISTATS.
[106] Qingquan Song,et al. Auto-Keras: An Efficient Neural Architecture Search System , 2018, KDD.
[107] Aaron Klein,et al. Towards Automatically-Tuned Neural Networks , 2016, AutoML@ICML.
[108] Simon Fong,et al. How meta-heuristic algorithms contribute to deep learning in the hype of big data analytics , 2018 .
[109] Shinji Watanabe,et al. Structure discovery of deep neural network based on evolutionary algorithms , 2015, 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[110] Aaron Klein,et al. Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search , 2018, ArXiv.
[111] Qi Yu,et al. DLAU: A Scalable Deep Learning Accelerator Unit on FPGA , 2016, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.
[112] James Higgins,et al. Evolving Deep Recurrent Neural Networks Using Ant Colony Optimization , 2015, EvoCOP.
[113] Frank Hutter,et al. Maximizing acquisition functions for Bayesian optimization , 2018, NeurIPS.
[114] Ming Du,et al. Computer vision algorithms and hardware implementations: A survey , 2019, Integr..
[115] Geoffrey E. Hinton. A Practical Guide to Training Restricted Boltzmann Machines , 2012, Neural Networks: Tricks of the Trade.
[116] Theodore Lim,et al. SMASH: One-Shot Model Architecture Search through HyperNetworks , 2017, ICLR.
[117] Sungroh Yoon,et al. Big/little deep neural network for ultra low power inference , 2015, 2015 International Conference on Hardware/Software Codesign and System Synthesis (CODES+ISSS).
[118] Gang Luo,et al. A review of automatic selection methods for machine learning algorithms and hyper-parameter values , 2016, Network Modeling Analysis in Health Informatics and Bioinformatics.
[119] Jon Atli Benediktsson,et al. Automatic Design of Convolutional Neural Network for Hyperspectral Image Classification , 2019, IEEE Transactions on Geoscience and Remote Sensing.
[120] Alexander Sergeev,et al. Horovod: fast and easy distributed deep learning in TensorFlow , 2018, ArXiv.
[121] Qiang Yang,et al. A Survey on Transfer Learning , 2010, IEEE Transactions on Knowledge and Data Engineering.
[122] D. Sculley,et al. Google Vizier: A Service for Black-Box Optimization , 2017, KDD.
[123] Chongchong Xu,et al. A Power-Efficient Accelerator Based on FPGAs for LSTM Network , 2017, 2017 IEEE International Conference on Cluster Computing (CLUSTER).
[124] Liang Lin,et al. SNAS: Stochastic Neural Architecture Search , 2018, ICLR.
[125] Xiaofang Wang,et al. Learnable Embedding Space for Efficient Neural Architecture Compression , 2019, ICLR.
[126] Ameet Talwalkar,et al. Paleo: A Performance Model for Deep Neural Networks , 2016, ICLR.
[127] J. Emer,et al. Understanding the Limitations of Existing Energy-Efficient Design Approaches for Deep Neural Networks , 2018 .
[128] Ramesh Raskar,et al. Accelerating Neural Architecture Search using Performance Prediction , 2017, ICLR.
[129] Zhen Lin,et al. Implementation and evaluation of deep neural networks (DNN) on mainstream heterogeneous systems , 2014, APSys.
[130] Qingfu Zhang,et al. Pareto Multi-Task Learning , 2019, NeurIPS.
[131] Wei Wei,et al. 2019 Formatting Instructions for Authors Using LaTeX , 2018 .
[132] El-Ghazali Talbi,et al. A unified view of parallel multi-objective evolutionary algorithms , 2019, J. Parallel Distributed Comput..
[133] Risto Miikkulainen,et al. Designing neural networks through neuroevolution , 2019, Nat. Mach. Intell..
[134] Yiyang Zhao,et al. AlphaX: eXploring Neural Architectures with Deep Neural Networks and Monte Carlo Tree Search , 2019, ArXiv.
[135] Torsten Hoefler,et al. Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis , 2018 .
[136] Aaron Klein,et al. BOHB: Robust and Efficient Hyperparameter Optimization at Scale , 2018, ICML.
[137] Nando de Freitas,et al. Taking the Human Out of the Loop: A Review of Bayesian Optimization , 2016, Proceedings of the IEEE.
[138] Nuno Lourenço,et al. DENSER: deep evolutionary network structured representation , 2018, Genetic Programming and Evolvable Machines.
[139] Sercan Ömer Arik,et al. Resource-Efficient Neural Architect , 2018, ArXiv.
[140] Bin Wang,et al. Evolving Deep Convolutional Neural Networks by Variable-Length Particle Swarm Optimization for Image Classification , 2018, 2018 IEEE Congress on Evolutionary Computation (CEC).
[141] Li Zhang,et al. Evolving Image Classification Architectures With Enhanced Particle Swarm Optimisation , 2018, IEEE Access.
[142] Gul Muhammad Khan,et al. Signal Reconstruction Using Evolvable Recurrent Neural Networks , 2018, IDEAL.
[143] Yiming Yang,et al. DARTS: Differentiable Architecture Search , 2018, ICLR.
[144] Mohamed Saber Naceur,et al. Reinforcement learning for neural architecture search: A review , 2019, Image Vis. Comput..
[145] Jasper Snoek,et al. Practical Bayesian Optimization of Machine Learning Algorithms , 2012, NIPS.
[146] Kaisa Miettinen,et al. Nonlinear multiobjective optimization , 1998, International series in operations research and management science.
[147] Manuel López-Ibáñez,et al. Ant colony optimization , 2010, GECCO '10.
[148] Jian Sun,et al. Deep Residual Learning for Image Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[149] Eric P. Xing,et al. GeePS: scalable deep learning on distributed GPUs with a GPU-specialized parameter server , 2016, EuroSys.
[150] Marc Snir,et al. Aluminum: An Asynchronous, GPU-Aware Communication Library Optimized for Large-Scale Training of Deep Neural Networks on HPC Systems , 2018, 2018 IEEE/ACM Machine Learning in HPC Environments (MLHPC).
[151] Quoc V. Le,et al. Large-Scale Evolution of Image Classifiers , 2017, ICML.
[152] Oriol Vinyals,et al. Hierarchical Representations for Efficient Architecture Search , 2017, ICLR.
[153] Yuan Yu,et al. TensorFlow: A system for large-scale machine learning , 2016, OSDI.
[154] José Miguel Hernández-Lobato. Designing Neural Network Hardware Accelerators with Decoupled Objective Evaluations , 2016 .
[155] Zbigniew Michalewicz,et al. Evolutionary Algorithms in Engineering Applications , 1997, Springer Berlin Heidelberg.
[156] Mengjie Zhang,et al. A Particle Swarm Optimization-Based Flexible Convolutional Autoencoder for Image Classification , 2017, IEEE Transactions on Neural Networks and Learning Systems.
[157] Kaiming He,et al. Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour , 2017, ArXiv.
[158] Andrew Gordon Wilson,et al. Practical Multi-fidelity Bayesian Optimization for Hyperparameter Tuning , 2019, UAI.
[159] Adam Gaier,et al. Weight Agnostic Neural Networks , 2019, NeurIPS.
[160] Jian Sun,et al. Convolutional neural networks at constrained time cost , 2014, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[161] Yaochu Jin,et al. Evolutionary multi-objective generation of recurrent neural network ensembles for time series prediction , 2014, Neurocomputing.
[162] Qi Luo,et al. An Implementation and Improvement of Convolutional Neural Networks on HSA Platform , 2017, ICPCSEE.
[163] Huiqi Li,et al. Differentiable Neural Architecture Search in Equivalent Space with Exploration Enhancement , 2020, NeurIPS.
[164] Sebastian Ruder,et al. An Overview of Multi-Task Learning in Deep Neural Networks , 2017, ArXiv.
[165] Jürgen Schmidhuber,et al. Long Short-Term Memory , 1997, Neural Computation.
[166] Frank Hutter,et al. A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets , 2017, ArXiv.
[167] Quoc V. Le,et al. Efficient Neural Architecture Search via Parameter Sharing , 2018, ICML.
[168] Jiaxu Cui,et al. Deep Neural Architecture Search with Deep Graph Bayesian Optimization , 2019, 2019 IEEE/WIC/ACM International Conference on Web Intelligence (WI).
[169] Cheng-Lin Liu,et al. DNA computing inspired deep networks design , 2020, Neurocomputing.
[170] Lars Kotthoff,et al. FlexiBO: Cost-Aware Multi-Objective Optimization of Deep Neural Networks , 2020, ArXiv.
[171] Aaron Klein,et al. Bayesian Optimization with Robust Bayesian Neural Networks , 2016, NIPS.
[172] Frédéric Gruau,et al. Genetic Synthesis of Modular Neural Networks , 1993, ICGA.
[173] Xiaogang Wang,et al. Sparsifying Neural Network Connections for Face Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[174] Naiyan Wang,et al. You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization , 2018, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[175] Travis Desell,et al. Optimizing Long Short-Term Memory Recurrent Neural Networks Using Ant Colony Optimization to Predict Turbine Engine Vibration , 2017, Appl. Soft Comput..
[176] Forrest N. Iandola,et al. FireCaffe: Near-Linear Acceleration of Deep Neural Network Training on Compute Clusters , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[177] Risto Miikkulainen,et al. From Nodes to Networks: Evolving Recurrent Neural Networks , 2018, ArXiv.
[178] Frank Hutter,et al. CMA-ES for Hyperparameter Optimization of Deep Neural Networks , 2016, ArXiv.
[179] Marc'Aurelio Ranzato,et al. Multi-GPU Training of ConvNets , 2013, ICLR.
[180] Ian D. Reid,et al. Structured Binary Neural Networks for Accurate Image Classification and Semantic Segmentation , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[181] Quoc V. Le,et al. Understanding and Simplifying One-Shot Architecture Search , 2018, ICML.
[182] Michael A. Osborne,et al. Raiders of the Lost Architecture: Kernels for Bayesian Optimization in Conditional Parameter Spaces , 2014, ArXiv.
[183] Alan L. Yuille,et al. Genetic CNN , 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[184] Bilel Derbel,et al. Shake Them All! - Rethinking Selection and Replacement in MOEA/D , 2014, PPSN.
[185] Bin Wang,et al. A Hybrid Differential Evolution Approach to Designing Deep Convolutional Neural Networks for Image Classification , 2018, Australasian Conference on Artificial Intelligence.
[186] Wei Pang,et al. DeepSwarm: Optimising Convolutional Neural Networks using Swarm Intelligence , 2019, UKCI.
[187] Guigang Zhang,et al. Deep Learning , 2016, Int. J. Semantic Comput..
[188] Mark Hasegawa-Johnson,et al. Deep Auto-Encoder Based Multi-Task Learning Using Probabilistic Transcriptions , 2017, INTERSPEECH.
[189] Frank Hutter,et al. Simple And Efficient Architecture Search for Convolutional Neural Networks , 2017, ICLR.
[190] Wojciech Zaremba,et al. An Empirical Exploration of Recurrent Network Architectures , 2015, ICML.
[191] Martin Wistuba,et al. A Survey on Neural Architecture Search , 2019, ArXiv.
[192] Bo Chen,et al. MnasNet: Platform-Aware Neural Architecture Search for Mobile , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[193] Stefan Wermter,et al. Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks , 2018, Int. J. Comput. Intell. Appl..
[194] Nikhil R. Devanur,et al. PipeDream: Fast and Efficient Pipeline Parallel DNN Training , 2018, ArXiv.
[195] Song Han,et al. Learning both Weights and Connections for Efficient Neural Network , 2015, NIPS.
[196] Steven R. Young,et al. Optimizing deep learning hyper-parameters through an evolutionary algorithm , 2015, MLHPC@SC.
[197] Changhu Wang,et al. Network Morphism , 2016, ICML.
[198] Ameet Talwalkar,et al. Random Search and Reproducibility for Neural Architecture Search , 2019, UAI.
[199] Yaochu Jin,et al. Surrogate-assisted evolutionary computation: Recent advances and future challenges , 2011, Swarm Evol. Comput..
[200] Alok Aggarwal,et al. Aging Evolution for Image Classifier Architecture Search , 2019, AAAI.
[201] Andrea Vedaldi,et al. Universal representations: The missing link between faces, text, planktons, and cat breeds , 2017, ArXiv.
[202] El-Ghazali Talbi,et al. Metaheuristics - From Design to Implementation , 2009 .
[203] Prabhat,et al. Scalable Bayesian Optimization Using Deep Neural Networks , 2015, ICML.
[204] Pedro Larrañaga,et al. A Review on Estimation of Distribution Algorithms , 2002, Estimation of Distribution Algorithms.
[205] Xin Yao,et al. Diversity creation methods: a survey and categorisation , 2004, Inf. Fusion.
[206] Masanori Suganuma,et al. Exploiting the Potential of Standard Convolutional Autoencoders for Image Restoration by Evolutionary Search , 2018, ICML.
[207] Andrew Zisserman,et al. Very Deep Convolutional Networks for Large-Scale Image Recognition , 2014, ICLR.
[208] Farinaz Koushanfar,et al. DeLight: Adding Energy Dimension To Deep Neural Networks , 2016, ISLPED.
[209] Yoshua Bengio,et al. FitNets: Hints for Thin Deep Nets , 2014, ICLR.
[210] Chen Lei,et al. Automated Machine Learning , 2021, Cognitive Intelligence and Robotics.