Constraint Exploration of Convolutional Network Architectures with Neuroevolution

The effort spent on adapting existing networks to new applications has motivated automated architecture search. Network structures discovered with evolutionary or other search algorithms have surpassed hand-crafted image classifiers in accuracy. However, these approaches typically do not constrain characteristics such as network size, which leads to unnecessary computational effort. This work therefore shows that generational evolutionary algorithms can be used for a constrained exploration of convolutional network architectures, producing a selection of networks suited to a specific application or target architecture.
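To make the idea of constrained exploration concrete, the following is a minimal sketch of a generational evolutionary loop over CNN layer configurations with a hard parameter budget. It is not the paper's actual algorithm: the genome encoding, the 2M-parameter budget, and the placeholder fitness function are illustrative assumptions; a real system would train each candidate network and use validation accuracy as fitness.

```python
# Minimal sketch of generational, constrained neuroevolution of CNN architectures.
# Genome encoding, budget, and fitness are illustrative assumptions, not the paper's setup.
import random

LAYER_CHOICES = [16, 32, 64, 128]   # candidate filter counts per conv layer
MAX_DEPTH = 6                       # maximum number of conv layers
PARAM_BUDGET = 2_000_000            # hypothetical constraint on network size


def random_genome():
    """A genome is a list of filter counts, one per convolutional layer."""
    depth = random.randint(1, MAX_DEPTH)
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]


def param_count(genome, input_channels=3, kernel_size=3):
    """Rough parameter estimate for a plain stack of 3x3 convolutions."""
    total, prev = 0, input_channels
    for filters in genome:
        total += prev * filters * kernel_size * kernel_size + filters
        prev = filters
    return total


def evaluate(genome):
    """Placeholder fitness: a real system would train the network and return
    validation accuracy; here candidates over the budget are simply rejected."""
    if param_count(genome) > PARAM_BUDGET:
        return 0.0                  # constraint violation -> effectively discarded
    return sum(genome) / (MAX_DEPTH * max(LAYER_CHOICES)) + random.uniform(0, 0.05)


def mutate(genome):
    """Either append a new layer or change the width of an existing one."""
    child = list(genome)
    if random.random() < 0.5 and len(child) < MAX_DEPTH:
        child.append(random.choice(LAYER_CHOICES))
    else:
        idx = random.randrange(len(child))
        child[idx] = random.choice(LAYER_CHOICES)
    return child


def evolve(generations=20, population_size=20, elite=4):
    """Generational loop: keep the elite, refill the population with mutants."""
    population = [random_genome() for _ in range(population_size)]
    for _ in range(generations):
        ranked = sorted(population, key=evaluate, reverse=True)
        parents = ranked[:elite]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(population_size - elite)]
    return max(population, key=evaluate)


if __name__ == "__main__":
    random.seed(0)
    best = evolve()
    print("best genome:", best, "params:", param_count(best))
```

The constraint is enforced here by assigning zero fitness to over-budget genomes; alternatives such as repairing offending genomes or penalizing fitness proportionally to the violation are equally plausible design choices.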
