23 results on "Liapunov functions -- Analysis"
Search Results
2. Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics
- Author
Wang, Liangyong, Chai, Tianyou, and Zhai, Lianfei
- Subjects
Liapunov functions -- Analysis, Neural networks -- Analysis, Sliding mode control -- Research, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Published
- 2009
3. Global robust stabilizing control for a dynamic neural network system
- Author
Liu, Ziqian, Shih, Stephen C., and Wang, Qunjing
- Subjects
Neural network, Neural networks -- Design and construction, Hamilton-Jacobi equations -- Analysis, Liapunov functions -- Analysis
- Abstract
This paper presents a new approach for the global robust stabilizing control of a class of dynamic neural network systems. This approach is developed via Lyapunov stability and inverse optimality, which circumvents the task of solving a Hamilton-Jacobi-Isaacs equation. The primary contribution of this paper is the development of a nonlinear H∞ control design for a class of dynamic neural network systems, which are usually used in the modeling and control of nonlinear affine systems with unknown nonlinearities. The proposed H∞ control design achieves global inverse optimality with respect to some meaningful cost functional, global disturbance attenuation, and global asymptotic stability provided that no disturbance occurs. Finally, four numerical examples are used to demonstrate the effectiveness of the proposed approach. Index Terms--Dynamic neural network system, Hamilton-Jacobi-Isaacs (HJI) equation, inverse optimality, Lyapunov stability, nonlinear H∞ control.
- Published
- 2009
4. Synchronization and state estimation for discrete-time complex networks with distributed delays
- Author
Liu, Yurong, Wang, Zidong, Liang, Jinling, and Liu, Xiaohui
- Subjects
Neural network, Discrete-time systems -- Analysis, Neural networks -- Analysis, Liapunov functions -- Analysis
- Abstract
In this paper, a synchronization problem is investigated for an array of coupled complex discrete-time networks with the simultaneous presence of both discrete and distributed time delays. The complex networks addressed, which include neural and social networks as special cases, are quite general. Rather than the commonly used Lipschitz-type function, a more general sector-like nonlinear function is employed to describe the nonlinearities existing in the network. The distributed infinite time delays in the discrete-time domain are first defined. By utilizing a novel Lyapunov-Krasovskii functional and the Kronecker product, it is shown that the addressed discrete-time complex network with distributed delays is synchronized if certain linear matrix inequalities (LMIs) are feasible. The state estimation problem is then studied for the same complex network, where the purpose is to design a state estimator to estimate the network states through available output measurements such that, for all admissible discrete and distributed delays, the dynamics of the estimation error is guaranteed to be globally asymptotically stable. Again, an LMI approach is developed for the state estimation problem. Two simulation examples are provided to show the usefulness of the proposed global synchronization and state estimation conditions. It is worth pointing out that our main results are valid even if the nominal subsystems within the network are unstable. (A generic sketch of the Kronecker-product formulation of such coupled networks follows this record.) Index Terms--Complex networks, discrete time delays, distributed time delays, linear matrix inequality (LMI), Lyapunov-Krasovskii functional, neural networks, state estimation, synchronization.
- Published
- 2008
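The Kronecker-product construction mentioned in the abstract above is standard in this literature. The following is a minimal, hypothetical sketch (Python/NumPy) of how an array of N identical coupled nodes is stacked into one large system; it is not the paper's model, and it omits the delays and the sector-like nonlinearity for brevity.

```python
import numpy as np

# Generic illustration: N identical linear nodes x_i(k+1) = A x_i(k) + c * sum_j G[i,j] * Gamma @ x_j(k)
# stack into one system x(k+1) = (I_N kron A + c * (G kron Gamma)) x(k).
N, n = 4, 2                                   # number of nodes, node state dimension (hypothetical)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])                    # node dynamics (hypothetical)
Gamma = np.eye(n)                             # inner coupling matrix
# Ring-topology outer coupling matrix with zero row sums
G = -2.0 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
G[0, -1] = G[-1, 0] = 1.0
c = 0.1                                       # coupling strength

A_big = np.kron(np.eye(N), A) + c * np.kron(G, Gamma)
x = np.random.default_rng(0).normal(size=N * n)
x_next = A_big @ x                            # one step of the stacked network
print(np.max(np.abs(np.linalg.eigvals(A_big))))  # spectral radius < 1 here, so this linear part is stable
```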
5. Improved sufficient conditions for global asymptotic stability of delayed neural networks
- Author
Wu, Wei and Cui, Bao Tong
- Subjects
Neural networks -- Analysis, Liapunov functions -- Analysis, Neural network, Business, Computers and office automation industries, Electronics, Electronics and electrical industries
- Abstract
This brief addresses the global asymptotic stability (GAS) of delayed neural networks. Based on the Lyapunov method and using some existing results for the existence and uniqueness of the equilibrium point, some sufficient conditions are obtained for checking the GAS without requiring the activation functions to be bounded or differentiable. Through comparison, it is illustrated that our conditions extend and improve some recent results. Index Terms--Delayed neural networks, equilibrium point, global asymptotic stability (GAS), Lyapunov functional.
- Published
- 2007
6. A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks
- Author
Man, Zhihong, Wu, Hong Ren, Liu, Sophie, and Yu, Xinghuo
- Subjects
Neural networks -- Research, Liapunov functions -- Analysis, Convergence (Mathematics) -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A new adaptive backpropagation (BP) algorithm based on Lyapunov stability theory for neural networks is developed in this paper. A candidate Lyapunov function V(k) of the tracking error between the output of the neural network and the desired reference signal is chosen first, and the weights of the neural network are then updated, from the output layer to the input layer, such that ΔV(k) = V(k) - V(k - 1) < 0. The output tracking error then asymptotically converges to zero according to Lyapunov stability theory. Unlike gradient-based BP training algorithms, the new Lyapunov adaptive BP algorithm is not used for searching for the global minimum point along the cost-function surface in the weight space; instead, it aims to construct an energy surface with a single global minimum point through adaptive adjustment of the weights as time goes to infinity. Even when the neural network is subject to bounded input disturbances, the effects of the disturbances can be eliminated and asymptotic error convergence can be obtained. The new Lyapunov adaptive BP algorithm is then applied to the design of an adaptive filter in a simulation example to show the fast error convergence and strong robustness with respect to large bounded input disturbances. (A minimal sketch of the ΔV(k) < 0 update idea follows this record.) Index Terms--Adaptive filtering, backpropagation (BP), convergence, feedforward neural networks, Lyapunov stability.
- Published
- 2006
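As referenced in the abstract above, the core idea is to accept weight updates only when a Lyapunov candidate of the tracking error decreases. The sketch below (Python/NumPy) is not the authors' update law: it is a tiny one-hidden-layer network and a hypothetical reference signal, trained with a step-halving rule that simply enforces ΔV(k) = V(k) - V(k-1) < 0 on a batch tracking error.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
d = np.sin(np.pi * X)                         # desired reference signal (hypothetical)

W1 = rng.normal(scale=0.5, size=(1, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)

def V(params):
    """Lyapunov candidate: mean squared tracking error of the network output."""
    W1, b1, W2, b2 = params
    y = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((y - d) ** 2))

eta = 0.5
for k in range(2000):
    h = np.tanh(X @ W1 + b1)
    y = h @ W2 + b2
    g = 2.0 * (y - d) / len(X)                # dV/dy
    gW2 = h.T @ g; gb2 = g.sum(axis=0)        # backpropagated gradients
    dh = (g @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh; gb1 = dh.sum(axis=0)
    step = eta
    while step > 1e-8:                        # accept a step only if Delta V < 0
        cand = (W1 - step * gW1, b1 - step * gb1, W2 - step * gW2, b2 - step * gb2)
        if V(cand) < V((W1, b1, W2, b2)):
            W1, b1, W2, b2 = cand
            break
        step *= 0.5                           # otherwise shrink the step and retry

print("final V:", V((W1, b1, W2, b2)))
```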
7. Design of real FIR filters with arbitrary magnitude and phase specifications using a neural-based approach
- Author
Jou, Yue-Dar
- Subjects
Neural networks -- Analysis, Liapunov functions -- Analysis, Least squares -- Analysis, Neural network, Business, Computers and office automation industries, Electronics, Electronics and electrical industries
- Abstract
An efficient yet simple neural-based approach is used to design real finite-impulse response (FIR) filters with arbitrary complex frequency responses in the least-squares sense. The proposed approach establishes the quadratic error difference of the filter optimization in the frequency domain as the Lyapunov energy function. Consequently, the optimal filter coefficients are obtained with good performance and fast convergence speed. To achieve good convergence for large filter lengths, a cooling process of simulated annealing is used for the neural activation function. Several examples and comparisons with existing methods are presented to illustrate the effectiveness and flexibility of the neural-based method. (A plain least-squares sketch of the frequency-domain formulation follows this record.) Index Terms--Finite-impulse response (FIR) filter, Lyapunov energy function, neural network, real-time processing.
- Published
- 2006
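The abstract above describes a frequency-domain least-squares fit to an arbitrary complex response. To make that formulation concrete, here is a hedged sketch (Python/NumPy, with a hypothetical linear-phase lowpass target) that solves the same least-squares problem directly with a linear solver rather than with the neural/simulated-annealing iteration the paper proposes.

```python
import numpy as np

# Hypothetical target: a linear-phase lowpass response (not taken from the paper)
N = 31                               # filter length
K = 256                              # number of frequency grid points
w = np.linspace(0, np.pi, K)         # frequency grid
cutoff = 0.4 * np.pi
group_delay = (N - 1) / 2
D = np.where(w <= cutoff, np.exp(-1j * w * group_delay), 0.0)   # desired complex response

# E[k, n] = exp(-j * w_k * n): contribution of tap n at frequency w_k
E = np.exp(-1j * np.outer(w, np.arange(N)))

# Real least squares: stack real and imaginary parts so the taps h stay real
A = np.vstack([E.real, E.imag])
b = np.concatenate([D.real, D.imag])
h, *_ = np.linalg.lstsq(A, b, rcond=None)

H = E @ h                            # achieved complex frequency response
print("max magnitude error:", np.max(np.abs(H - D)))
```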
8. Globally asymptotic stability of a class of neutral-type neural networks with delays
- Author
Cheng, Chao-Jung, Liao, Teh-Lu, Yan, Jun-Juh, and Hwang, Chi-Chuan
- Subjects
Neural network, Neural networks -- Analysis, Liapunov functions -- Analysis, Asymptotic expansions -- Analysis
- Abstract
Several stability conditions for a class of systems with retarded-type delays are presented in the literature. However, no results have yet been presented for neural networks with neutral-type delays. Accordingly, this correspondence investigates the globally asymptotic stability of a class of neutral-type neural networks with delays. This class of systems includes Hopfield neural networks, cellular neural networks, and Cohen-Grossberg neural networks. Based on the Lyapunov stability method, two delay-independent sufficient stability conditions are derived. These stability conditions are easily checked and can be derived from the connection matrix and the network parameters without the requirement for any assumptions regarding the symmetry of the interconnections. Two illustrative examples are presented to demonstrate the validity of the proposed stability criteria. Index Terms--Cellular neural networks (CNNs), Cohen-Grossberg neural networks (CGNNs), Hopfield neural networks (HNNs), neutral-type neural networks.
- Published
- 2006
9. A tighter bound for the echo state property
- Author
Buehner, Michael and Young, Peter
- Subjects
Liapunov functions -- Analysis, Neural networks -- Analysis, Robust statistics -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
This letter gives a brief explanation of echo state networks (ESNs) and provides a rigorous bound for guaranteeing their asymptotic stability. The stability bounds presented here could aid in the design of echo state networks for control applications where stability is required. (A sketch of the classical, more conservative norm condition follows this record.) Index Terms--Echo state networks (ESNs), Lyapunov stability, nonlinear systems, recurrent neural networks (RNN), robust controls, weighted operator norms.
- Published
- 2006
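The letter above tightens the classical sufficient condition for the echo state property. For reference only, the sketch below (Python/NumPy, hypothetical reservoir size) shows that classical, more conservative condition: rescaling the reservoir matrix so its operator 2-norm (largest singular value) is below 1, which suffices for the echo state property with tanh units. It does not reproduce the tighter weighted-operator-norm bound derived in the letter.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # reservoir size (hypothetical)
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))

# Classical sufficient condition: ||W||_2 < 1 guarantees the echo state property
sigma_max = np.linalg.norm(W, 2)          # largest singular value of W
W_esn = 0.9 * W / sigma_max               # rescaled so that ||W_esn||_2 = 0.9 < 1
print(np.linalg.norm(W_esn, 2))
```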
10. Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays
- Author
Wang, Zidong, Liu, Yurong, Li, Maozhen, and Liu, Xiaohui
- Subjects
Neural networks -- Analysis, Liapunov functions -- Analysis, Stochastic systems -- Usage, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
In this letter, the global asymptotic stability analysis problem is considered for a class of stochastic Cohen-Grossberg neural networks with mixed time delays, which consist of both discrete and distributed time delays. Based on a Lyapunov-Krasovskii functional and stochastic stability analysis theory, a linear matrix inequality (LMI) approach is developed to derive several sufficient conditions guaranteeing the global asymptotic convergence of the equilibrium point in the mean square. It is shown that the addressed stochastic Cohen-Grossberg neural networks with mixed delays are globally asymptotically stable in the mean square if two LMIs are feasible, where the feasibility of the LMIs can be readily checked by the Matlab LMI toolbox. It is also pointed out that the main results comprise some existing results as special cases. A numerical example is given to demonstrate the usefulness of the proposed global stability criteria. (A generic LMI feasibility sketch follows this record.) Index Terms--Cohen-Grossberg neural networks, discrete delays, distributed delays, global asymptotic stability, linear matrix inequality (LMI), Lyapunov-Krasovskii functional, stochastic systems.
- Published
- 2006
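The abstract above reduces stability to an LMI feasibility check (done there with the Matlab LMI toolbox). As a generic illustration only, and not the specific LMIs derived in the paper, the sketch below checks feasibility of the standard Lyapunov LMIs (P positive definite, A^T P + P A negative definite) for a hypothetical stable matrix A, using the cvxpy package instead of Matlab.

```python
import cvxpy as cp
import numpy as np

# Hypothetical stable system matrix (not from the paper)
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                    # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]     # A^T P + P A negative definite
prob = cp.Problem(cp.Minimize(0), constraints)          # pure feasibility problem
prob.solve()
print(prob.status)   # 'optimal' means the LMIs are feasible, i.e., A is Lyapunov-stable
```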
11. Global asymptotic stability of delayed Cohen-Grossberg neural networks
- Author
Chen, Y.
- Subjects
Neural networks -- Analysis, Delay lines -- Design and construction, Liapunov functions -- Analysis, Neural network, Business, Computers and office automation industries, Electronics, Electronics and electrical industries
- Abstract
In this paper, we study Cohen-Grossberg neural networks with discrete and distributed delays. For a general class of internal decay functions, without assuming the boundedness, differentiability, or monotonicity of the activation functions, we establish some sufficient conditions for the existence of a unique equilibrium and its global asymptotic stability. The theory of M-matrices and the Lyapunov functional technique are employed. The criteria are independent of the delays, and hence delays are harmless in our case. Our results improve and generalize some existing ones. (A small numerical M-matrix check is sketched after this record.) Index Terms--Cohen-Grossberg neural network, discrete delay, distributed delay, equilibrium, global (asymptotic) stability, M-matrix.
- Published
- 2006
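The abstract above relies on M-matrix theory. As a small, self-contained aid (not code from the paper, and with a hypothetical test matrix), the sketch below checks one standard characterization of a nonsingular M-matrix: non-positive off-diagonal entries and all eigenvalues with positive real part.

```python
import numpy as np

def is_nonsingular_M_matrix(A, tol=1e-10):
    """Standard characterization: off-diagonal entries <= 0 and
    every eigenvalue has a positive real part."""
    A = np.asarray(A, dtype=float)
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):
        return False
    return bool(np.all(np.linalg.eigvals(A).real > tol))

# Hypothetical test matrix: diagonally dominant with non-positive off-diagonals
A = np.array([[ 2.0, -0.5, -0.3],
              [-0.4,  3.0, -0.2],
              [-0.1, -0.6,  1.5]])
print(is_nonsingular_M_matrix(A))   # True
```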
12. Neural network control of a class of nonlinear systems with actuator saturation
- Author
Gao, Wenzhi and Selmic, Rastko R.
- Subjects
Neural networks -- Research, Nonlinear networks -- Research, Liapunov functions -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A neural net (NN)-based actuator saturation compensation scheme for nonlinear systems in Brunovsky canonical form is presented. The scheme, which leads to stability, command following, and disturbance rejection, is rigorously proved and verified on a general pendulum-type system and a robot manipulator. The online weight-tuning law, the overall closed-loop system performance, and the boundedness of the NN weights are derived and guaranteed based on a Lyapunov approach. The actuator saturation is assumed to be unknown, and the saturation compensator is inserted into a feedforward path. Simulation results indicate that the proposed scheme can effectively compensate for the saturation nonlinearity in the presence of system uncertainty. Index Terms--Actuator nonlinearities, Brunovsky canonical form, neural network (NN), saturation compensation, stability.
- Published
- 2006
13. Stability analysis of cohen-grossberg neural networks
- Author
Guo, Shangjiang and Huang, Lihong
- Subjects
Neural networks -- Research, Liapunov functions -- Analysis, Delay lines -- Research, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
Without assuming boundedness and differentiability of the activation functions or any symmetry of interconnections, we employ Lyapunov functions to establish some sufficient conditions ensuring existence, uniqueness, global asymptotic stability, and even global exponential stability of equilibria for Cohen-Grossberg neural networks with and without delays. Our results are presented in terms of system parameters and can be easily verified; they are also less restrictive than previously known criteria and can be applied to neural networks including Hopfield neural networks, bidirectional associative memory neural networks, and cellular neural networks. Index Terms--Equilibrium, global asymptotic stability (GAS), Lyapunov functions, neural networks, time delays.
- Published
- 2006
14. A stable neural network-based observer with application to flexible-joint manipulators
- Author
Abdollahi, Farzaneh, Talebi, H.A., and Patel, Rajnikant V.
- Subjects
Neural networks -- Research, Liapunov functions -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A stable neural network (NN)-based observer for general multivariable nonlinear systems is presented in this paper. Unlike most previous neural network observers, the proposed observer uses a nonlinear-in-parameters neural network (NLPNN). Therefore, it can be applied to systems with higher degrees of nonlinearity without any a priori knowledge about system dynamics. The learning rule for the neural network is a novel approach based on the modified backpropagation (BP) algorithm. An e-modification term is added to guarantee robustness of the observer. No strictly positive real (SPR) or any other strong assumption is imposed on the proposed approach. The stability of the recurrent neural network observer is shown by Lyapunov's direct method. Simulation results for a flexible-joint manipulator are presented to demonstrate the enhanced performance achieved by utilizing the proposed neural network observer. Index Terms--Flexible joint manipulators, neural networks (NN), nonlinear observer.
- Published
- 2006
15. The self-trapping attractor neural network--Part II: properties of a sparsely connected model storing multiple memories
- Author
Pavloski, Raymond and Karimi, Majid
- Subjects
Neural networks -- Analysis, Liapunov functions -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
In a previous paper [1], the self-trapping network (STN) was introduced as more biologically realistic than attractor neural networks (ANNs) based on the Ising model. This paper extends the previous analysis of a one-dimensional (1-D) STN storing a single memory to a model that stores multiple memories and possesses generalized sparse connectivity. The energy, Lyapunov function, and partition function derived for the 1-D model are generalized to the case of an attractor network with only near-neighbor synapses, coupled to a system that computes memory overlaps. Simulations reveal that 1) the STN dramatically reduces intra-ANN connectivity without severely affecting the size of basins of attraction, with fast self-trapping able to sustain attractors even in the absence of intra-ANN synapses; 2) the basins of attraction can be controlled by a single free parameter, providing natural attention-like effects; 3) the same parameter determines the memory capacity of the network, and the latter is much less dependent than a standard ANN on the noise level of the system; 4) the STN serves as a useful memory for some correlated memory patterns for which the standard ANN totally fails; 5) the STN can store a large number of sparse patterns; and 6) a Monte Carlo procedure, a competitive neural network, and binary neurons with thresholds can be used to induce self-trapping. Index Terms--Associative memory, attractor neural network (ANN), connectivity, coupled systems, Hopfield model, Ising model, self-trapping network (STN).
- Published
- 2005
16. Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays
- Author
Arik, Sabri
- Subjects
Liapunov functions -- Analysis, Neural networks -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
This paper presents a sufficient condition for the existence, uniqueness and global asymptotic stability of the equilibrium point for bidirectional associative memory (BAM) neural networks with distributed time delays. The results impose constraint conditions on the network parameters of neural system independently of the delay parameter, and they are applicable to all continuous nonmonotonic neuron activation functions. It is shown that in some special cases of the results, the stability criteria can be easily checked. Some examples are also given to compare the results with the previous results derived in the literature. Index Terms--Delayed neural networks, equilibrium and stability analysis, Lyapunov functionals.
- Published
- 2005
17. Further results on adaptive control for a class of nonlinear systems using neural networks
- Author
Huang, Sunan N., Tan, K.K., and Lee, T.H.
- Subjects
Neural networks -- Research, Adaptive control -- Analysis, Liapunov functions -- Analysis, Nonlinear networks -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
Zhang et al. presented an excellent neural-network (NN) controller for a class of nonlinear control designs. The singularity issue is completely avoided. Based on a modified Lyapunov function, their lemma illustrates the existence of an ideal control which is important in establishing the NN approximator. In this note, we provide a Lyapunov function to realize an alternative ideal control which is more direct and simpler. The major contributions of this note are divided into two parts. First, it proposes a control scheme which results in a smaller dimensionality of NN than that of Zhang et al. In this way, the proposed NN controller is easier to implement and more reliable for practical purposes. Second, by removing certain restrictions from the design reported by Zhang et al., we further develop a new NN controller, which can be applied to a wider class of systems. Index Terms--Adaptive control, Lyapunov function, neural networks, nonlinear systems.
- Published
- 2003
18. Global exponential stability of competitive neural networks with different time scales. (Letters)
- Author
Meyer-Baese, A., Pilyugin, S.S., and Chen, Y.
- Subjects
Neural networks -- Research, Stability -- Analysis, Liapunov functions -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
The dynamics of cortical cognitive maps developed by self-organization must include the aspects of long- and short-term memory. The behavior of such a neural network is characterized by an equation of neural activity as a fast phenomenon and an equation of synaptic modification as a slow part of the neural system. We present a new method of analyzing the dynamics of a biologically relevant system with different time scales based on the theory of flow invariance. We show conditions under which the solutions of such a system are bounded, conditions that are less restrictive than those obtained with K-monotone theory, singular perturbation theory, or approaches based on supervised synaptic learning. We prove the existence and uniqueness of the equilibrium. A strict Lyapunov function for the flow of a competitive neural system with different time scales is given, and based on it we prove the global exponential stability of the equilibrium point. Index Terms--Flow invariance, global exponential stability, multitime scale neural network.
- Published
- 2003
19. On the global stability of delayed neural networks
- Author
Zhang, Qiang, Ma, Runnian, Wang, Chao, and Xu, Jin
- Subjects
Neural network, Neural networks -- Design and construction, Liapunov functions -- Analysis
- Abstract
Lyapunov functional methods, combined with some inequality techniques, are employed to study the global asymptotic stability of delayed neural networks. Without assuming Lipschitz conditions on the activation functions, a new sufficient condition is established. This criterion allows non-Lipschitzian activation functions to be included in the design of delayed neural networks. The result presented here is also discussed in terms of its relationship to some previous results. Index Terms--Delayed neural networks, global asymptotic stability, Lyapunov functional, non-Lipschitzian activation functions.
- Published
- 2003
20. A reinforcement discrete neuro-adaptive control for unknown piezoelectric actuator systems with dominant hysteresis
- Author
Hwang, Chih-Lyang and Jan, Chau
- Subjects
Neural networks -- Research, Hysteresis -- Analysis, Liapunov functions -- Analysis, Piezoelectric devices, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
The theoretical and experimental studies of a reinforcement discrete neuro-adaptive control for unknown piezoelectric actuator systems with dominant hysteresis are presented. Two separate nonlinear gains, together with an unknown linear dynamical system, construct the nonlinear model (NM) of the piezoelectric actuator systems. A nonlinear inverse control (NIC) according to the learned NM is then designed to compensate the hysteretic phenomenon and to track the reference input without the risk of discontinuous response. Because the uncertainties are dynamic, a recurrent neural network (RNN) with residue compensation is employed to model them in a compact subset. Then, a discrete neuro-adaptive sliding-mode control (DNASMC) is designed to enhance the system performance. The stability of the overall system is verified by Lyapunov stability theory. Comparative experiments for various control schemes are also given to confirm the validity of the proposed control. Index Terms--Hysteresis, learning law with projection, piezoelectric actuator, recurrent neural network (RNN), sliding-mode control.
- Published
- 2003
21. Estimating the Lyapunov Exponent of a chaotic system with nonparametric regression
- Author
McCaffrey, Daniel F., Ellner, Stephen, Gallant, A. Ronald, and Nychka, Douglas W.
- Subjects
Neural networks -- Usage, Chaos theory -- Research, Liapunov functions -- Analysis, Nonparametric statistics -- Usage, Nonparametric statistics -- Methods, Neural network, Mathematics
- Abstract
We discuss procedures based on nonparametric regression for estimating the dominant Lyapunov exponent λ1 from time series data generated by a nonlinear autoregressive system with additive noise. For systems with bounded fluctuations, λ1 > 0 is the defining feature of chaos. Thus our procedures can be used to examine time series data for evidence of chaotic dynamics. We show that a consistent estimator of the partial derivatives of the autoregression function can be used to obtain a consistent estimator of λ1. The rate of convergence we establish is quite slow; a better rate of convergence is derived heuristically and supported by simulations. Simulation results from several implementations--one 'local' (thin-plate splines) and three 'global' (neural nets, radial basis functions, and projection pursuit)--are presented for two deterministic chaotic systems. Local splines and neural nets yield accurate estimates of the Lyapunov exponent; however, the spline method is sensitive to the choice of the embedding dimension. Limited results for a noisy system suggest that the thin-plate spline and neural net regression methods also provide reliable values of the Lyapunov exponent in this case. (A minimal Jacobian-method sketch follows this record.) KEY WORDS: Dynamical systems; Neural networks; Nonlinear dynamics; Nonlinear time series models; Projection pursuit regression; Thin-plate smoothing splines. 1. INTRODUCTION: Nonlinear dynamical systems (e.g., difference or differential equations) can behave in ways that are hard to distinguish from a random process. This phenomenon, called chaos, is now recognized [...]
- Published
- 1992
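The procedure described in the abstract above estimates λ1 by fitting the autoregression function and accumulating the log of its derivative along the orbit (the Jacobian method). Below is a minimal sketch in Python/NumPy: it uses a polynomial fit as a stand-in for the splines and neural nets in the paper, and noise-free logistic-map data (exact exponent ln 2) as hypothetical input.

```python
import numpy as np

# Simulate the logistic map x_{t+1} = 4 x_t (1 - x_t); its exact Lyapunov exponent is ln 2.
rng = np.random.default_rng(1)
T = 5000
x = np.empty(T)
x[0] = rng.uniform(0.1, 0.9)
for t in range(T - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Regression step: fit the one-step map x_{t+1} = f(x_t) with a flexible polynomial
# (standing in for the nonparametric regressions used in the paper).
coeffs = np.polyfit(x[:-1], x[1:], deg=5)
f_prime = np.polyder(coeffs)

# Jacobian estimate of the dominant Lyapunov exponent:
# lambda_1 ~= (1/T) * sum_t log |f'(x_t)|
lam_hat = np.mean(np.log(np.abs(np.polyval(f_prime, x[:-1]))))
print(lam_hat, np.log(2.0))   # the estimate should be close to ln 2 ~= 0.693
```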
22. An improved global asymptotic stability criterion for delayed cellular neural networks
- Author
He, Yong, Wu, Min, and She, Jin-Hua
- Subjects
Neural networks -- Usage, Liapunov functions -- Analysis, Asymptotic expansions -- Analysis, Neural network, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A new Lyapunov-Krasovskii functional is constructed for delayed cellular neural networks, and the S-procedure is employed to handle the nonlinearities. An improved global asymptotic stability criterion is also derived that is a generalization of, and an improvement over, previous results. Numerical examples demonstrate the effectiveness of the criterion. Index Terms--Delayed cellular neural networks, global asymptotic stability, linear matrix inequality (LMI), S-procedure.
- Published
- 2006
23. Robust stability analysis of adaptive control based on recurrent ANN
- Author
Zerkaoui, Salem, Druaux, Fabrice, Leclercq, Edouard, and Lefebvre, Dimitri
- Subjects
Liapunov functions -- Analysis, Neural networks -- Forecasts and trends, Dynamical systems -- Evaluation, Neural network, Market trend/market analysis, Engineering and manufacturing industries, Science and technology
- Abstract
Adaptive control by means of neural networks for non-linear dynamical systems is an open issue. For real-world applications, practitioners have to pay attention to external disturbances, parameter uncertainty and measurement noise, since these factors influence the stability of the closed-loop system. As a consequence, the robust stability of a closed loop controlled by a neural network is an important issue that must be considered. Our contribution concerns the robustness analysis and synthesis of an adaptive indirect control scheme. This scheme is based on fully connected neural networks and is inspired by standard real-time recurrent learning. The analysis combines a Lyapunov approach with linearisation around the nominal parameters to establish analytical sufficient conditions for the global robust stability of the adaptive neural network controller. Advantages of the proposed algorithm are illustrated through simulation examples.
- Published
- 2008