Improvements of Complex-Valued Hopfield Associative Memory by Using Generalized Projection Rules

In this letter, new design methods for complex-valued multistate Hopfield associative memories (CVHAMs) are presented. We show that the well-known projection rule proposed by Personnaz et al. can be generalized to the complex domain, so that the weight matrix of the CVHAM can be designed by a simple and effective method. The stability of the proposed CVHAM is analyzed with an energy-function approach, which shows that, in the synchronous update mode, the proposed model is guaranteed to converge to a fixed point from any given initial state. Moreover, the projection geometry of the generalized projection rule (GPR) is discussed. To enhance the recall capability, a strategy for eliminating spurious memories is also reported. The validity and performance of the proposed methods are investigated by computer simulations.
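
For illustration, the sketch below (a minimal NumPy example, not the authors' exact GPR) shows the classical projection rule of Personnaz et al. carried over to complex-valued multistate patterns: the weight matrix is the orthogonal projector onto the span of the stored patterns, W = S S+, and recall proceeds by synchronous updates through a K-state phase quantizer. The function names (csign, projection_weights, recall) and the resolution factor K are illustrative assumptions, not notation from the letter.

```python
import numpy as np

def csign(z, K):
    """Quantize complex values to the nearest of the K phase states
    exp(j*2*pi*k/K), k = 0..K-1 (the usual CVHAM multistate activation)."""
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(1j * 2 * np.pi * k / K)

def projection_weights(patterns):
    """patterns: (N, P) complex array whose columns are the stored states.
    Returns W = S S^+ , the orthogonal projector onto the column space of S
    (the classical projection rule, written for complex data)."""
    S = np.asarray(patterns, dtype=complex)
    return S @ np.linalg.pinv(S)

def recall(W, x, K, iters=50):
    """Synchronous recall: x <- csign(W x, K) until a fixed point is reached."""
    x = np.asarray(x, dtype=complex)
    for _ in range(iters):
        x_new = csign(W @ x, K)
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```

Because W projects onto the column space of S, every stored pattern s satisfies W s = s and is therefore a fixed point of the synchronous recall above; this is the property the projection rule is built on.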

[1] Marc M. Van Hulle et al., Monitoring the Formation of Kernel-Based Topographic Maps with Application to Hierarchical Clustering of Music Signals, 2002, J. VLSI Signal Process.

[2] Jacek M. Zurada et al., A new design method for the complex-valued multistate Hopfield associative memory, 2003, IEEE Trans. Neural Networks.

[3] Marc M. Van Hulle, Kernel-Based Equiprobabilistic Topographic Map Formation, 1998, Neural Computation.

[4] Donq-Liang Lee et al., Relaxation of the stability condition of the complex-valued neural networks, 2001, IEEE Trans. Neural Networks.

[5] J. J. Hopfield, Neurons with graded response have collective computational properties like those of two-state neurons, 1984, Proceedings of the National Academy of Sciences of the United States of America.

[6] Isabelle Guyon et al., A biologically constrained learning mechanism in networks of formal neurons, 1986.

[7] P. Lancaster et al., The Theory of Matrices: With Applications, 1985.

[8] F. R. Gantmakher, The Theory of Matrices, 1984.

[9] L. Personnaz et al., Collective computational properties of neural networks: New learning mechanisms, 1986, Physical Review A.

[10] Jacek M. Zurada et al., Complex-valued multistate neural associative memory, 1996, IEEE Trans. Neural Networks.

[11] Giovanni Costantini et al., Neural associative memory storing gray-coded gray-scale images, 2003, IEEE Trans. Neural Networks.

[12] Anthony N. Michel et al., Analysis and synthesis of discrete-time neural networks with multilevel threshold functions, 1991, IEEE International Symposium on Circuits and Systems.

[13] Marc M. Van Hulle et al., Faithful representations with topographic maps, 1999, Neural Networks.

[14] Panu Somervuo et al., How to make large self-organizing maps for nonvectorial data, 2002, Neural Networks.

[15] Jacek M. Zurada et al., Generalized Hopfield networks for associative memories with multi-valued stable states, 1996, Neurocomputing.

[16] Michel Herbin et al., A clustering method based on the estimation of the probability density function and on the skeleton by influence zones: Application to image processing, 1996, Pattern Recognit. Lett.

[17] David Zipser et al., Feature Discovery by Competitive Learning, 1986, Cogn. Sci.

[18] Temujin Gautama et al., Batch map extensions of the kernel-based maximum entropy learning rule, 2006, IEEE Trans. Neural Networks.

[19] Donq-Liang Lee, Improving the capacity of complex-valued neural networks with a modified gradient descent learning rule, 2001, IEEE Trans. Neural Networks.

[20] Thomas Villmann et al., Topology preservation in self-organizing feature maps: exact definition and measurement, 1997, IEEE Trans. Neural Networks.

[21] J. Goodman et al., Neural networks for computation: number representations and programming complexity, 1986, Applied Optics.

[22] Ralf Der et al., Controlling the Magnification Factor of Self-Organizing Feature Maps, 1996, Neural Computation.