
Global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays

Authors :
Xuyang Lou
Yan Ji
Baotong Cui
Source :
Chaos, Solitons & Fractals. 40:344-354
Publication Year :
2009
Publisher :
Elsevier BV

Abstract

This paper considers the global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays. The inputs of the neural networks are allowed to be time-varying, and the activation functions are assumed to be globally Lipschitz continuous and monotonically nondecreasing. Based on M-matrix theory, several sufficient conditions are established that guarantee the global output convergence of this class of neural networks. Neither symmetry of the connection weight matrices nor boundedness of the activation functions is required. The convergence results are useful for solving certain optimization problems and for designing Cohen–Grossberg neural networks with both time-varying and distributed delays. Two examples are given to illustrate the effectiveness of the results.
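Since the abstract states that the sufficient conditions rest on M-matrix theory, a brief illustration of the standard M-matrix test may be helpful. The sketch below is not the paper's method; it only checks the textbook characterization that a Z-matrix (non-positive off-diagonal entries) is a nonsingular M-matrix exactly when all of its eigenvalues have positive real part. The function name and tolerance are illustrative assumptions.

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-9):
    """Textbook check: A is a nonsingular M-matrix iff it is a
    Z-matrix (off-diagonal entries <= 0) and every eigenvalue of A
    has positive real part. Name and tolerance are illustrative."""
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):        # fails the Z-matrix requirement
        return False
    # All eigenvalues must lie strictly in the open right half-plane.
    return bool(np.min(np.linalg.eigvals(A).real) > tol)

# Strictly diagonally dominant Z-matrix: a nonsingular M-matrix.
print(is_nonsingular_m_matrix([[ 3.0, -1.0],
                               [-1.0,  2.0]]))   # True
# Z-matrix with a negative eigenvalue: not an M-matrix.
print(is_nonsingular_m_matrix([[ 1.0, -2.0],
                               [-2.0,  1.0]]))   # False
```

In delay-network stability results of this type, conditions of the form "a comparison matrix built from the weights and Lipschitz constants is a nonsingular M-matrix" are common; a check like this lets one verify such a condition numerically for a given example.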

Details

ISSN :
0960-0779
Volume :
40
Database :
OpenAIRE
Journal :
Chaos, Solitons & Fractals
Accession number :
edsair.doi...........12fca7e4213e6ac26e1d4964021bd7b4