Improved upper bound on step-size parameters of discrete-time recurrent neural networks for linear inequality and equation system

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, Vol. 49, No. 5, pp. 695-698
Main Authors: Liang, Xue-Bin; Tso, Shiu Kit
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2002
Summary: In this brief, an improved upper bound on the step-size parameters of a globally convergent discrete-time recurrent neural network (RNN) model, proposed recently in the literature for solving the linear inequality and equation system, is obtained without the original boundedness requirement on the solution set of the linear system, while the step-size parameters are allowed to differ from one another. Consequently, the rate of convergence of the discrete-time RNN model can be improved by setting the step-size parameters as large as possible, whether or not the solution set of the linear system is bounded. It is shown by an example that the obtained upper bound is tight, in the sense that the RNN in that example is globally convergent if and only if the step-size parameters are less than the given upper bound. A numerical simulation example of a globally convergent discrete-time RNN solving a specific linear inequality and equation system with an unbounded solution set is also provided.
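The abstract does not reproduce the RNN model or the bound itself. The Python sketch below only illustrates the kind of discrete-time iteration the brief concerns: it assumes, as is common for this problem class, a gradient-descent RNN on the energy E(x) = (1/2)||(Ax - b)^+||^2 + (1/2)||Cx - d||^2 for the system Ax <= b, Cx = d, and uses the illustrative step-size bound 2 / lambda_max(A^T A + C^T C). The function name, the starting point, and the bound are assumptions for illustration, not the exact model or bound of Liang and Tso.

import numpy as np

def rnn_solve(A, b, C, d, step_scale=0.95, max_iter=10000, tol=1e-10):
    # Discrete-time iteration x(k+1) = x(k) - alpha * grad E(x(k)), where
    # E(x) = 0.5*||max(A x - b, 0)||^2 + 0.5*||C x - d||^2.
    # Illustrative step-size bound (an assumption, not the paper's bound):
    # alpha < 2 / lambda_max(A^T A + C^T C).
    lam_max = np.linalg.eigvalsh(A.T @ A + C.T @ C).max()
    alpha = step_scale * 2.0 / lam_max
    x = np.ones(A.shape[1])  # arbitrary starting point
    for _ in range(max_iter):
        grad = A.T @ np.maximum(A @ x - b, 0.0) + C.T @ (C @ x - d)
        if np.linalg.norm(grad) < tol:  # gradient vanishes exactly on the solution set
            break
        x -= alpha * grad
    return x

# Unbounded solution set, the case the abstract emphasizes:
# x1 <= 1 together with x1 = 0 leaves x2 completely free.
A = np.array([[1.0, 0.0]])
b = np.array([1.0])
C = np.array([[1.0, 0.0]])
d = np.array([0.0])
print(rnn_solve(A, b, C, d))  # converges to a solution with x1 = 0

Because the gradient of the energy is zero at every point of the solution set, the iteration converges to some solution even when that set is unbounded, which is the situation the improved bound covers.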
ISSN: 1057-7122
EISSN: 1558-1268
DOI: 10.1109/TCSI.2002.1001961