Global exponential stability for Hopfield neural networks

Hopfield neural networks are usually studied under the assumption that all output response functions are smooth and monotone increasing. In many practical applications, however, the output responses are nonsmooth. In this paper, the continuous differentiability condition imposed on the output response functions of Hopfield neural networks in the existing literature is relaxed to a Lipschitz condition. A theorem on the global exponential convergence of the solutions of the networks is established by means of a Lyapunov functional, and several new criteria for the global exponential stability of the networks are obtained. These results substantially improve the main results of recent related papers.
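
For concreteness, a minimal sketch of the setting is given below, assuming the standard continuous Hopfield model; the abstract does not fix notation, so the symbols $b_i$, $a_{ij}$, $g_j$, $I_i$, $L_j$, $M$, and $\varepsilon$ are illustrative choices rather than the paper's own.

% Hopfield network dynamics (standard form, assumed here for illustration):
\[
\dot{x}_i(t) = -b_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i,
\qquad i = 1, \dots, n,
\]
% where $b_i > 0$ are the self-decay rates, $a_{ij}$ the connection weights,
% $g_j$ the output response functions, and $I_i$ constant external inputs.
%
% Lipschitz condition replacing smoothness and monotonicity: for each $j$
% there exists a constant $L_j > 0$ such that
\[
\lvert g_j(u) - g_j(v) \rvert \le L_j \lvert u - v \rvert
\qquad \text{for all } u, v \in \mathbb{R}.
\]
% Global exponential stability of an equilibrium $x^*$ means there exist
% constants $M \ge 1$ and $\varepsilon > 0$ such that every solution satisfies
\[
\lVert x(t) - x^* \rVert \le M\, \lVert x(0) - x^* \rVert\, e^{-\varepsilon t},
\qquad t \ge 0.
\]

Note that the Lipschitz condition admits nonsmooth responses such as the piecewise-linear saturation $g_j(u) = \tfrac{1}{2}(\lvert u+1 \rvert - \lvert u-1 \rvert)$, which is not differentiable at $u = \pm 1$ and so is excluded by the continuous-differentiability assumption of earlier work.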