Implementation of Deep Neural Networks for Industry Applications
Janusz Kolbusz | Bogdan M. Wilamowski | Pawel Rózycki | Grzegorz Krzos