Properties of an invariant set of weights of perceptrons

In this paper, the dynamics of the weights of perceptrons trained by the perceptron training algorithm are investigated. In particular, a condition under which the system map is not injective is derived. Based on this condition, an invariant set on which the restricted map is bijective is characterized. It is also shown that some weights outside the invariant set are eventually mapped into it; hence, the invariant set is attracting. Numerical simulation results are presented for various perceptrons exhibiting different behaviors, including fixed points, limit cycles, and chaos.
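To illustrate the kind of weight dynamics studied here, the following is a minimal sketch that iterates the classical perceptron update, w ← w + α·y_i·x_i applied only to misclassified samples, and records the resulting weight sequence as a discrete-time trajectory. The function names, learning-rate value, and synthetic data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def perceptron_weight_trajectory(X, y, w0, lr=1.0, steps=200):
    """Iterate the classical perceptron update and record the weight sequence.

    The map w -> w + lr * y_i * x_i is applied only when sample i is
    misclassified; samples are cycled in a fixed order so the weight
    sequence can be viewed as a discrete-time dynamical system.
    """
    w = np.asarray(w0, dtype=float)
    trajectory = [w.copy()]
    n = len(y)
    for k in range(steps):
        i = k % n                          # cycle through the training set
        if y[i] * np.dot(w, X[i]) <= 0:    # misclassified: apply the update
            w = w + lr * y[i] * X[i]
        trajectory.append(w.copy())
    return np.array(trajectory)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(10, 2))
    y = np.sign(X[:, 0] + 0.3 * X[:, 1])   # linearly separable labels
    traj = perceptron_weight_trajectory(X, y, w0=[0.5, -0.5])
    # For separable data the weight sequence eventually stops changing
    # (fixed-point behavior); non-separable data can instead yield
    # limit-cycle or more complicated weight trajectories.
    print(traj[-5:])
```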
