Analysis of Regularization in Deep Neural Networks Using a Metagraph Approach

The article addresses the overfitting problem in deep neural networks. Finding a model whose number of parameters matches the complexity of the simulated process can be a difficult task. There is a range of recommendations on how to choose the number of neurons in hidden layers, but most of them do not work reliably in practice. As a result, neural networks often operate in an underfitting or overfitting regime. Therefore, in practice a deliberately complex model is usually chosen and regularization strategies are applied. This paper discusses the main regularization techniques for multilayer perceptrons, including early stopping and dropout. A representation of regularization using the metagraph approach is described. In the creation mode, the metagraph representation of the neural network is built by metagraph agents; in the training mode, the training metagraph is constructed. Different regularization strategies can thus be embedded into the training algorithm. A special metagraph agent implementing the dropout strategy is developed. Different regularization techniques are compared on the CoverType dataset. The experimental results are analyzed, and the advantages of the early stopping and dropout regularization strategies are discussed.
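
For reference, the standard (inverted) dropout computation that a dropout agent would embed into training can be sketched in a few lines of NumPy; this is a minimal illustrative sketch, and the function name and parameters below are assumptions, not taken from the paper's metagraph implementation.

    import numpy as np

    def dropout_forward(x, p_drop=0.5, training=True, rng=None):
        """Inverted dropout: during training, zero each activation with
        probability p_drop and rescale the survivors by 1/(1 - p_drop) so
        the expected activation is unchanged; at inference, pass x through."""
        if not training or p_drop == 0.0:
            return x
        rng = rng if rng is not None else np.random.default_rng()
        mask = rng.random(x.shape) >= p_drop  # keep with probability 1 - p_drop
        return x * mask / (1.0 - p_drop)

    # Example: apply dropout to a batch of hidden-layer activations.
    h = np.ones((4, 8))                          # stand-in hidden activations
    h_train = dropout_forward(h, p_drop=0.5)     # roughly half the units zeroed
    h_eval = dropout_forward(h, training=False)  # identity at inference time

Because the rescaling is applied at training time, the network requires no modification at inference, which is why the inverted variant is the common choice when dropout is wired into a training algorithm.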