Projection Rule for Rotor Hopfield Neural Networks

A rotor Hopfield neural network (RHNN) is an extension of a complex-valued Hopfield neural network (CHNN). RHNNs have several attractive properties; for example, the storage capacity of an RHNN is twice that of a CHNN. Most importantly, an RHNN does not store the rotated versions of its training patterns, whereas a CHNN does, which degrades the CHNN's noise robustness. However, conventional learning methods for RHNNs, such as the Hebbian learning rule and gradient descent learning rules, suffer from limited storage capacity, poor noise robustness, and long learning times. In this paper, we propose a projection rule for RHNNs and demonstrate that it improves their noise robustness beyond that of CHNNs. As the number of training patterns increases, the noise robustness of a CHNN deteriorates rapidly, whereas that of an RHNN degrades far more slowly. Moreover, an RHNN, unlike a CHNN, can easily recover from rotated patterns. We demonstrate these properties by computer simulation.
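To make the idea concrete, the following is a minimal sketch of a projection-rule rotor associative memory, not the paper's exact formulation. It assumes the common setup in which each rotor neuron's state is a 2D unit vector, a pattern of N neurons is stacked into a real vector of length 2N, and the projection rule takes the weight matrix to be the orthogonal projector onto the span of the training patterns (via the pseudoinverse). The sizes N and P and the recall schedule are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 16, 3  # number of rotor neurons and training patterns (assumed sizes)

# Each training pattern: N rotor states (unit 2-vectors) stacked into a
# real vector of length 2N; X collects the P patterns as columns.
angles = rng.uniform(0, 2 * np.pi, size=(P, N))
X = np.stack(
    [np.column_stack([np.cos(a), np.sin(a)]).ravel() for a in angles],
    axis=1,
)  # shape (2N, P)

# Projection rule: W is the orthogonal projector onto the span of the
# training patterns, so every stored pattern is an exact fixed point.
W = X @ np.linalg.pinv(X)  # shape (2N, 2N)

def recall(s, W, n_iter=10):
    """Synchronous recall: weighted sum, then renormalize each rotor
    back to unit length (the rotor activation)."""
    for _ in range(n_iter):
        h = W @ s
        blocks = h.reshape(-1, 2)
        norms = np.linalg.norm(blocks, axis=1, keepdims=True)
        norms[norms == 0] = 1.0  # avoid division by zero for null rotors
        s = (blocks / norms).ravel()
    return s

# A stored pattern is recalled exactly.
p0 = X[:, 0]
assert np.allclose(recall(p0, W), p0)

# A noise-corrupted version converges back close to the stored pattern;
# overlap is the normalized inner product (1.0 means perfect recall).
noisy = recall(p0 + 0.1 * rng.standard_normal(2 * N), W)
overlap = float(noisy @ p0) / N
```

Because W projects exactly onto the pattern subspace, recall of a stored pattern requires no learning iterations, in contrast to Hebbian or gradient descent rules; the trade-off is the cost of the pseudoinverse.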

[1]  Noest Discrete-state phasor neural networks. , 1988, Physical review. A, General physics.

[2]  Akira Hirose,et al.  Complex-Valued Neural Networks (Studies in Computational Intelligence) , 2006 .

[3]  Akira Hirose,et al.  Complex-Valued Neural Networks: Advances and Applications , 2013 .

[4]  K. Aihara,et al.  Complex-Valued Multistate Associative Memory With Nonlinear Multilevel Functions for Gray-Level Image Reconstruction , 2009, IEEE Transactions on Neural Networks.

[5]  D. A. Pospelov,et al.  Multi-valued threshold functions , 1971 .

[6]  Masaki Kobayashi,et al.  Fundamental Abilities of Rotor Associative Memory , 2010, 2010 IEEE/ACIS 9th International Conference on Computer and Information Science.

[7]  Masaki Kobayashi,et al.  Complex-valued Associative Memory with Strong Thresholds , 2011 .

[8]  Igor N. Aizenberg Periodic Activation Function and a Modified Learning Algorithm for the Multivalued Neuron , 2010, IEEE Transactions on Neural Networks.

[9]  Igor N. Aizenberg,et al.  Complex-Valued Neural Networks with Multi-Valued Neurons , 2011, Studies in Computational Intelligence.

[10]  Masaki Kobayashi,et al.  Gradient Descent Learning for Rotor Associative Memory , 2011 .

[11]  Yukio Kosugi,et al.  An image storage system using complex-valued associative memories , 2000, Proceedings 15th International Conference on Pattern Recognition. ICPR-2000.

[12]  Danilo P. Mandic,et al.  Complex Valued Nonlinear Adaptive Filters , 2009 .

[13]  Masaki Kobayashi,et al.  Reducing Spurious States by Rotor Associative Memory , 2011 .

[14]  Masaki Kobayashi,et al.  Projection rule for complex-valued associative memory with large constant terms , 2012 .

[15]  Masaki Kobayashi,et al.  Rotor Associative Memory with a Periodic Activation Function , 2012, The 2012 International Joint Conference on Neural Networks (IJCNN).

[16]  H. Aoki Image Association Using a Complex-Valued Associative Memory Model , 2000 .

[17]  Munakata,et al.  Neural-network model composed of multidimensional spin neurons. , 1995, Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics.

[18]  Teuvo Kohonen,et al.  Self-Organization and Associative Memory , 1988 .

[19]  Lucas C. Parra,et al.  Continuous Boltzmann machine with rotor neurons , 1995, Neural Networks.

[20]  Yukio Kosugi,et al.  Rotation-Invariant Image Association for Endoscopic Positional Identification Using Complex-Valued Associative Memories , 2001, IWANN.

[21]  Jacek M. Zurada,et al.  A new design method for the complex-valued multistate Hopfield associative memory , 2003, IEEE Trans. Neural Networks.

[22]  Masaki Kobayashi,et al.  Noise Robust Gradient Descent Learning for Complex-Valued Associative Memory , 2011, IEICE Trans. Fundam. Electron. Commun. Comput. Sci..

[23]  J J Hopfield,et al.  Neurons with graded response have collective computational properties like those of two-state neurons. , 1984, Proceedings of the National Academy of Sciences of the United States of America.

[24]  John J. Hopfield,et al.  Neural networks and physical systems with emergent collective computational abilities , 1999 .

[25]  D. Mandic,et al.  Complex Valued Nonlinear Adaptive Filters: Noncircularity, Widely Linear and Neural Models , 2009 .

[26]  Carsten Peterson,et al.  Rotor Neurons: Basic Formalism and Dynamics , 1992, Neural Computation.

[27]  André J. Noest,et al.  Phasor Neural Networks , 1987, NIPS.

[28]  Richard S. Zemel,et al.  Lending direction to neural networks , 1995, Neural Networks.

[29]  D. A. Pospelov,et al.  Multivalued threshold functions , 1973 .

[30]  J Cook,et al.  The mean-field theory of a Q-state neural network model , 1989 .

[31]  Masaki Kobayashi,et al.  Chaotic Rotor Associative Memory , 2009 .

[32]  Masaki Kobayashi,et al.  Pseudo-Relaxation Learning Algorithm for Complex-Valued Associative Memory , 2008, Int. J. Neural Syst..

[33]  Jacek M. Zurada,et al.  Complex-valued multistate neural associative memory , 1996, IEEE Trans. Neural Networks.

[34]  Akira Hirose,et al.  Complex-Valued Neural Networks: Theories and Applications , 2003 .

[35]  Naum N. Aizenberg,et al.  CNN based on multi-valued neuron as a model of associative memory for grey scale images , 1992, CNNA '92 Proceedings Second International Workshop on Cellular Neural Networks and Their Applications.

[36]  Donq-Liang Lee Improving the capacity of complex-valued neural networks with a modified gradient descent learning rule , 2001, IEEE Trans. Neural Networks.