Some Insights Into Convolutional Neural Networks

As a kind of deep learning model, convolutional neural networks (CNNs) have greatly boosted state-of-the-art performance and have found successful applications in many fields, such as computer vision, pattern recognition, and natural language processing. Many distinguished CNN models, such as AlexNet, GoogLeNet, and VGGNet, have been developed for various tasks. In this paper, we provide some insights into convolutional neural networks from three aspects: the activation function, the convolution operation, and the pooling operation; we also discuss the training of CNNs. The insights presented in this paper should give readers a deeper understanding of CNNs.
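The three aspects discussed above can be sketched in a minimal NumPy example that chains a single convolution, a ReLU activation, and a max-pooling step. This is an illustrative sketch, not the paper's method: the function names, the kernel, and the input are all invented for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (implemented as cross-correlation,
    as is conventional in most CNN libraries)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """ReLU activation: max(0, x) applied elementwise."""
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size  # drop ragged edges
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 6x6 "image" and a diagonal-difference filter (hypothetical example data).
image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[-1.0, 0.0], [0.0, 1.0]])

# Convolution -> activation -> pooling, the pipeline of one CNN layer.
feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map.shape)  # prints (2, 2): 5x5 conv output pooled 2x2
```

Stacking many such layers, with learned kernels instead of a fixed one, is what distinguishes the deep CNN models the paper surveys.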

[1]  Shan Sung Liew,et al.  Bounded activation functions for enhanced training stability of deep neural networks on visual pattern recognition problems , 2016, Neurocomputing.

[2]  Joan Bruna,et al.  Signal recovery from Pooling Representations , 2013, ICML.

[3]  Fei-Fei Li,et al.  ImageNet: A large-scale hierarchical image database , 2009, 2009 IEEE Conference on Computer Vision and Pattern Recognition.

[4]  Dumitru Erhan,et al.  Going deeper with convolutions , 2014, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

[5]  Andrew Zisserman,et al.  Very Deep Convolutional Networks for Large-Scale Image Recognition , 2014, ICLR.

[6]  Ting Liu,et al.  Recent advances in convolutional neural networks , 2015, Pattern Recognit..

[7]  Mohamed Khalil-Hani,et al.  Bounded activation functions for enhanced training stability of deep neural networks on visual pattern recognition problems , 2016, Neurocomputing.

[8]  Geoffrey E. Hinton,et al.  Deep Learning , 2015, Nature.

[9]  Geoffrey E. Hinton,et al.  ImageNet classification with deep convolutional neural networks , 2012, Commun. ACM.

[10]  Yoshua Bengio,et al.  Gradient-based learning applied to document recognition , 1998, Proc. IEEE.

[11]  Zhihua Wei,et al.  Mixed Pooling for Convolutional Neural Networks , 2014, RSKT.

[12]  Pierre Baldi,et al.  Understanding Dropout , 2013, NIPS.

[13]  Rob Fergus,et al.  Visualizing and Understanding Convolutional Networks , 2013, ECCV.

[14]  Nitish Srivastava,et al.  Dropout: a simple way to prevent neural networks from overfitting , 2014, J. Mach. Learn. Res..

[15]  Misha Denil,et al.  Noisy Activation Functions , 2016, ICML.

[16]  Pierre Baldi,et al.  The dropout learning algorithm , 2014, Artif. Intell..

[17]  Rob Fergus,et al.  Stochastic Pooling for Regularization of Deep Convolutional Neural Networks , 2013, ICLR.

[18]  Xiaodong Gu,et al.  Towards dropout training for convolutional neural networks , 2015, Neural Networks.

[19]  Jian Sun,et al.  Deep Residual Learning for Image Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).