Global exponential convergence of Cohen-Grossberg neural networks with time delays

In this paper, we derive a general sufficient condition ensuring global exponential convergence of Cohen-Grossberg neural networks with time delays by constructing a novel Lyapunov functional and carefully estimating its derivative. The condition involves convex combinations of the column sums and row sums of the connection matrices, and it relaxes the constraints on the network coefficients imposed by earlier work; it therefore generalizes several previous results in the literature.
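For context, the class of systems studied is presumably the standard Cohen-Grossberg network with discrete delays; a typical formulation (the notation below is a common convention, not taken from this paper) is

```latex
\dot{x}_i(t) = -a_i\bigl(x_i(t)\bigr)\Bigl[\, b_i\bigl(x_i(t)\bigr)
  - \sum_{j=1}^{n} c_{ij}\, f_j\bigl(x_j(t)\bigr)
  - \sum_{j=1}^{n} d_{ij}\, f_j\bigl(x_j(t-\tau_{ij})\bigr)
  + I_i \,\Bigr], \qquad i = 1, \dots, n,
```

where the $a_i$ are amplification functions, the $b_i$ are well-behaved (e.g. increasing) functions, $C = (c_{ij})$ and $D = (d_{ij})$ are the instantaneous and delayed connection matrices, the $f_j$ are activation functions, $\tau_{ij} \ge 0$ are transmission delays, and the $I_i$ are external inputs. A convex-combination condition of the kind described in the abstract would then, for some $\lambda \in [0,1]$, bound a mixture such as $\lambda \sum_j |c_{ij}| + (1-\lambda) \sum_j |c_{ji}|$ (and its delayed counterpart with $d_{ij}$) against the decay terms; taking $\lambda = 0$ or $\lambda = 1$ recovers pure row-sum or pure column-sum criteria.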
