The learning problem of multi-layer neural networks

This manuscript considers the learning problem of multi-layer neural networks (MNNs) whose activation function comes from cellular neural networks. A systematic investigation of the partition of the parameter space is provided. Furthermore, a recursive formula for the transition matrix of an MNN is obtained. By applying well-developed tools from symbolic dynamics, the topological entropy of an MNN can be computed explicitly. A novel phenomenon is revealed: the asymmetry of the topological diagram, which was seen in Ban, Chang, Lin, and Lin [J. Differential Equations 246, pp. 552-580, 2009].
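
The entropy computation described above rests on a standard fact from symbolic dynamics: for a subshift of finite type with transition matrix T, the topological entropy is the logarithm of the Perron (largest) eigenvalue of T. The sketch below illustrates only this final step, not the paper's recursive construction of the MNN transition matrix; the function name and the golden-mean example matrix are assumptions chosen for demonstration.

```python
import numpy as np

def topological_entropy(T):
    """Topological entropy of a subshift of finite type: log of the
    spectral radius (Perron eigenvalue) of its 0-1 transition matrix T."""
    eigenvalues = np.linalg.eigvals(np.asarray(T, dtype=float))
    return float(np.log(max(abs(eigenvalues))))

# Illustrative example (not from the paper): the golden-mean shift,
# in which the word "11" is forbidden, has transition matrix [[1, 1], [1, 0]]
# and entropy log((1 + sqrt(5)) / 2).
T = [[1, 1],
     [1, 0]]
print(topological_entropy(T))        # ~0.4812
print(np.log((1 + np.sqrt(5)) / 2))  # closed form for comparison
```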

[1]  P. McMullen, Convex Sets and Their Applications, 1982.

[2]  Kunihiko Fukushima, Training Multi-layered Neural Network Neocognitron, 2012.

[3]  Bernard Widrow et al., Layered neural nets for pattern recognition, IEEE Trans. Acoust. Speech Signal Process., 1988.

[4]  Carsten Peterson et al., A New Method for Mapping Optimization Problems Onto Neural Networks, Int. J. Neural Syst., 1989.

[5]  Musbah M. Aqel et al., Pattern recognition using multilayer neural-genetic algorithm, Neurocomputing, 2003.

[6]  Marcello Sanguineti et al., Can Two Hidden Layers Make a Difference?, ICANNGA, 2013.

[7]  Kurt Hornik et al., Multilayer feedforward networks are universal approximators, Neural Networks, 1989.

[8]  Bernard Widrow et al., 30 years of adaptive neural networks: perceptron, Madaline, and backpropagation, Proc. IEEE, 1990.

[9]  David Haussler et al., Unsupervised learning of distributions on binary vectors using two layer networks, NIPS, 1991.

[10]  Kunihiko Fukushima et al., Artificial vision by multi-layered neural networks: Neocognitron and its advances, Neural Networks, 2013.

[11]  Chih-Hung Chang et al., Spatial complexity in multi-layer cellular neural networks, J. Differential Equations, 2009.

[12]  Lin-Bao Yang et al., Cellular neural networks: theory, 1988.

[13]  Thomas Serre et al., A quantitative theory of immediate visual recognition, Progress in Brain Research, 2007.

[14]  Paul E. Utgoff et al., Many-Layered Learning, Neural Computation, 2002.

[15]  Chih-Hung Chang et al., On the Structure of Two-Layer Cellular Neural Networks, 2013.

[16]  R. Kannan, Convex Sets and their Applications, 2006.

[17]  J. J. Hopfield et al., “Neural” computation of decisions in optimization problems, Biological Cybernetics, 1985.

[18]  Yee Whye Teh et al., A Fast Learning Algorithm for Deep Belief Nets, Neural Computation, 2006.

[19]  Leon O. Chua, World Scientific Series on Nonlinear Science, 2009.

[20]  Wen-Wei Lin et al., Cellular Neural Networks: Local Patterns for General Templates, Int. J. Bifurc. Chaos, 2000.

[21]  Douglas Lind et al., An Introduction to Symbolic Dynamics and Coding, 1995.

[22]  Yoshua Bengio et al., Scaling learning algorithms towards AI, 2007.

[23]  Chih-Hung Chang et al., Diamond in multi-layer cellular neural networks, Appl. Math. Comput., 2013.

[24]  Chih-Hung Chang et al., On the structure of multi-layer cellular neural networks, 2012.

[25]  Jonq Juang et al., Cellular Neural Networks: Mosaic Pattern and Spatial Chaos, SIAM J. Appl. Math., 2000.

[26]  Yoshua Bengio et al., Learning Deep Architectures for AI, Found. Trends Mach. Learn., 2007.