The convergence properties of a clipped Hopfield network and its application in the design of a keystream generator

We first present a modified Hopfield network, the clipped Hopfield network, whose synaptic weights are restricted to the three values {-1, 0, +1}. We give the necessary conditions under which a set of 2n binary vectors can be stored as stable points of the network. We show that in the parallel updating mode, the network converges to one of these 2n stable points from most initial state vectors. We further demonstrate that these 2n stable points can be divided into two groups, the alpha group and the beta group, each containing n stable points, and that the basins of attraction of the stable points are evenly distributed within each group. By way of application, we show that this class of Hopfield network can be used to build a cryptographically secure keystream generator.
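The abstract does not spell out the learning rule, so the following sketch only illustrates the general idea of a clipped Hopfield network: weights obtained from a standard Hebbian outer-product rule and then clipped to {-1, 0, +1}, updated synchronously (in parallel) through a hard threshold. The clipping rule, the function names, and the convergence check are assumptions made for illustration, not the paper's actual construction of the 2n stable points or of the keystream generator.

```python
import numpy as np

def clipped_hebbian_weights(patterns):
    """Hebbian outer-product learning followed by clipping each weight to
    {-1, 0, +1}.  `patterns` is an (m, n) array of +/-1 vectors to store.
    (Illustrative only; the paper derives the exact conditions under which
    2n binary vectors become stable points.)"""
    W = patterns.T @ patterns          # sum of outer products (Hebbian rule)
    np.fill_diagonal(W, 0)             # no self-connections
    return np.sign(W).astype(int)      # clip weights to the three values

def parallel_update(W, state, max_iters=100):
    """Synchronous (parallel) updating: every neuron recomputes its output
    from the same previous state.  Returns the state reached when the
    network stops changing, i.e. a candidate stable point."""
    s = state.copy()
    for _ in range(max_iters):
        s_next = np.where(W @ s >= 0, 1, -1)   # hard-threshold activation
        if np.array_equal(s_next, s):
            return s                            # fixed point reached
        s = s_next
    return s                                    # may end on a short cycle

# Example: store a few random +/-1 patterns and probe convergence
rng = np.random.default_rng(0)
n = 16
patterns = rng.choice([-1, 1], size=(4, n))
W = clipped_hebbian_weights(patterns)
probe = patterns[0].copy()
probe[:3] *= -1                                 # corrupt a few bits
recovered = parallel_update(W, probe)
print("recovered stored pattern:", np.array_equal(recovered, patterns[0]))
```

In this toy setting the corrupted probe typically falls back into the basin of attraction of the stored pattern; the paper's contribution is to characterize these basins precisely for its 2n stable points and to exploit the dynamics for keystream generation.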
