Optimizing the neural network hyperparameters utilizing genetic algorithm

Neural networks (NNs), among the most robust and efficient machine learning methods, are widely used to solve a broad range of problems. However, the choice of hyperparameters (e.g., the number of hidden layers and the number of neurons per layer) strongly influences their accuracy, and a considerable number of studies have therefore addressed NN hyperparameter optimization. In this study, a genetic algorithm is applied to find optimal NN hyperparameters. The deep energy method, which is built on a deep neural network, is first applied to a Timoshenko beam and to a plate with a hole. The numbers of hidden layers, integration points, and neurons per layer are then optimized to maximize the accuracy of the predicted stress distribution in these structures. In both examples, applying a suitable optimization method to the NN significantly increases its prediction accuracy.
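The search described above can be sketched as a simple genetic algorithm over a discrete hyperparameter space. The sketch below is illustrative only: the candidate ranges and the fitness function are hypothetical stand-ins (the real objective would train the deep energy method with each candidate and return its prediction error), and only standard-library `random` is used.

```python
import random

# Hypothetical search space (illustrative, not the paper's exact ranges).
LAYER_CHOICES = list(range(1, 6))        # number of hidden layers
NEURON_CHOICES = [10, 20, 30, 40, 50]    # neurons per hidden layer
POINT_CHOICES = [50, 100, 200, 400]      # integration points

def random_individual(rng):
    return (rng.choice(LAYER_CHOICES),
            rng.choice(NEURON_CHOICES),
            rng.choice(POINT_CHOICES))

def fitness(ind):
    # Placeholder objective: in the real setting, train the network with
    # these hyperparameters and return the stress-prediction error.
    # A toy convex surrogate stands in here so the sketch is runnable.
    layers, neurons, points = ind
    return ((layers - 3) ** 2
            + (neurons - 30) ** 2 / 100
            + (points - 200) ** 2 / 10000)

def mutate(ind, rng, rate=0.2):
    # Resample each gene independently with probability `rate`.
    layers, neurons, points = ind
    if rng.random() < rate:
        layers = rng.choice(LAYER_CHOICES)
    if rng.random() < rate:
        neurons = rng.choice(NEURON_CHOICES)
    if rng.random() < rate:
        points = rng.choice(POINT_CHOICES)
    return (layers, neurons, points)

def crossover(a, b, rng):
    # Uniform crossover: each gene is taken from either parent.
    return tuple(rng.choice(pair) for pair in zip(a, b))

def genetic_search(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # lower fitness = better
        elite = pop[: pop_size // 2]          # truncation selection
        children = [
            mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
            for _ in range(pop_size - len(elite))
        ]
        pop = elite + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = genetic_search()
    print("best hyperparameters:", best, "fitness:", fitness(best))
```

The expensive part in practice is the fitness evaluation, since each candidate requires a full training run; this is why the population size and generation count are kept small relative to typical GA settings.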