A New Method for Stability Analysis of Recurrent Neural Networks With Interval Time-Varying Delay

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, Vol. 21, No. 2, pp. 339-344
Main Authors: Zuo, Zhiqiang; Yang, Cuili; Wang, Yijing
Format: Journal Article
Language: English
Published: New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.02.2010

Summary: This brief deals with the problem of stability analysis for a class of recurrent neural networks (RNNs) with a time-varying delay that lies in a given range. Both delay-independent and delay-dependent conditions are derived. For the former, an augmented Lyapunov functional is constructed and the derivative of the state is retained. Since the obtained criterion realizes the decoupling of the Lyapunov function matrix and the coefficient matrix of the neural networks, it can be easily extended to handle neural networks with polytopic uncertainties. For the latter, a new type of delay-range-dependent condition is proposed using the free-weighting matrix technique to obtain a tighter upper bound on the derivative of the Lyapunov-Krasovskii functional. Two examples are given to illustrate the effectiveness and the reduced conservatism of the proposed results.
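Stability criteria of this kind, derived from Lyapunov-Krasovskii functionals, are typically posed as linear matrix inequality (LMI) feasibility problems. As a rough, hypothetical illustration of how such a condition is checked numerically (this sketch uses the classical delay-independent LMI for a linear system with delay, not the criterion derived in this brief; the matrices A and A_d are invented for demonstration), one can solve the feasibility problem with CVXPY:

import numpy as np
import cvxpy as cp

# Hypothetical linearized delayed dynamics: x'(t) = A x(t) + A_d x(t - tau(t)).
# A and A_d are made-up example matrices, not taken from the paper.
A = np.array([[-2.0, 0.1],
              [0.1, -2.0]])
A_d = np.array([[0.5, 0.2],
                [0.2, 0.5]])
n = A.shape[0]

# Decision variables of the Lyapunov-Krasovskii functional
# V(x_t) = x(t)' P x(t) + integral_{t-tau}^{t} x(s)' Q x(s) ds.
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Classical delay-independent stability LMI:
# [[A'P + P A + Q, P A_d], [A_d' P, -Q]] < 0 with P > 0, Q > 0.
M = cp.bmat([[A.T @ P + P @ A + Q, P @ A_d],
             [A_d.T @ P, -Q]])
M = (M + M.T) / 2  # symmetrize so CVXPY accepts the semidefinite constraint

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible (delay-independent stability certified):",
      prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))

The delay-range-dependent conditions described in the brief involve additional free-weighting matrix variables and bounds on the delay interval, but they lead to the same kind of semidefinite feasibility check.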
ISSN: 1045-9227, 1941-0093
DOI: 10.1109/TNN.2009.2037893