Estimation of deep neural networks capabilities based on a trigonometric approach

The rapid development of computing hardware has led to renewed interest in deep neural networks. It has long been known that they offer great capabilities, but exploiting them requires new training algorithms. This paper demonstrates the benefits of using deep neural networks by analyzing Fourier series approximations of the activation function for shallow and deep network architectures. The proposed approach is confirmed by experiments.
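
To give a concrete flavor of the trigonometric viewpoint sketched in the abstract, the snippet below is a minimal, hedged illustration (not the paper's actual procedure): it counts how many Fourier harmonics a truncated series needs to approximate a target function to a given error. Reading each harmonic as one sinusoidal hidden neuron, the count is a rough estimate of the width a shallow network would need, which is the kind of capability comparison the paper makes between shallow and deep architectures. The function name, target function, and error tolerance are illustrative assumptions.

```python
import numpy as np

def fourier_terms_needed(target, n_samples=1024, tol=0.05, max_terms=256):
    """Count harmonics of a truncated Fourier series needed to reach an RMS
    error below `tol` on [0, 2*pi). Illustrative sketch only."""
    x = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    y = target(x)
    coeffs = np.fft.rfft(y) / n_samples          # one-sided Fourier coefficients
    approx = np.full_like(y, coeffs[0].real)     # start from the mean (DC) term
    for k in range(1, max_terms + 1):
        # Add the k-th harmonic (one "sinusoidal neuron" of a shallow network).
        approx += 2.0 * (coeffs[k].real * np.cos(k * x)
                         - coeffs[k].imag * np.sin(k * x))
        err = np.sqrt(np.mean((y - approx) ** 2))
        if err < tol:
            return k, err
    return max_terms, err

if __name__ == "__main__":
    # A square-wave-like target has sharp transitions, so it needs many
    # harmonics (a wide shallow network), whereas a deep architecture could
    # in principle build such transitions with far fewer units.
    def square(x):
        return np.sign(np.sin(x))

    k, err = fourier_terms_needed(square)
    print(f"harmonics needed: {k}, RMS error: {err:.3f}")
```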
