Global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays
- Source :
- Chaos, Solitons & Fractals. 40:344-354
- Publication Year :
- 2009
- Publisher :
- Elsevier BV, 2009.
Abstract
- This paper considers the global output convergence of Cohen–Grossberg neural networks with both time-varying and distributed delays. The inputs of the neural networks are required to be time-varying, and the activation functions are required to be globally Lipschitz continuous and monotonically nondecreasing. Based on M-matrix theory, several sufficient conditions are established to guarantee the global output convergence of this class of neural networks. Neither symmetry of the connection weight matrices nor boundedness of the activation functions is required. The convergence results are useful in solving some optimization problems and in the design of Cohen–Grossberg neural networks with both time-varying and distributed delays. Two examples are given to illustrate the effectiveness of the results.
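- The record does not reproduce the paper's specific sufficient conditions, but M-matrix-based criteria of this kind are typically verified by checking that a comparison matrix, built from decay-rate bounds, global Lipschitz constants of the activations, and the absolute connection weights, is a nonsingular M-matrix. The Python sketch below shows such a check on hypothetical data; the names alpha, L, and W, and the particular comparison matrix, are illustrative assumptions rather than the paper's actual construction.

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-10):
    """Return True if A is a nonsingular M-matrix, i.e. a Z-matrix
    (nonpositive off-diagonal entries) all of whose eigenvalues have
    positive real part."""
    A = np.asarray(A, dtype=float)
    if A.shape[0] != A.shape[1]:
        raise ValueError("A must be square")
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):           # off-diagonal entries must be <= 0
        return False
    return bool(np.all(np.linalg.eigvals(A).real > tol))

# Hypothetical data: alpha_i are assumed lower bounds on the decay terms,
# L_j are assumed global Lipschitz constants of the activations, and W
# collects absolute values of the (delayed plus distributed) weights.
alpha = np.array([3.0, 2.5])
L = np.array([1.0, 1.0])
W = np.array([[0.4, 0.6],
              [0.5, 0.3]])
M = np.diag(alpha) - W * L               # comparison matrix of the criterion
print(is_nonsingular_m_matrix(M))        # True -> a condition of this type holds
```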
- Subjects :
- Mathematical optimization
Optimization problem
Artificial neural network
Control theory
General Mathematics
Applied Mathematics
Convergence
Connection weights
Activation function
General Physics and Astronomy
Statistical and Nonlinear Physics
Lipschitz continuity
Mathematics
Details
- ISSN :
- 0960-0779
- Volume :
- 40
- Database :
- OpenAIRE
- Journal :
- Chaos, Solitons & Fractals
- Accession number :
- edsair.doi...........12fca7e4213e6ac26e1d4964021bd7b4