Stability Analysis for Delayed Neural Networks via a Generalized Reciprocally Convex Inequality

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 34, No. 10, pp. 7491-7499
Main Authors: Lin, Hui-Chao; Zeng, Hong-Bing; Zhang, Xian-Ming; Wang, Wei
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2023
Summary: This article deals with the stability of neural networks (NNs) with time-varying delay. First, a generalized reciprocally convex inequality (RCI) is presented, providing a tight bound for reciprocally convex combinations. This inequality includes several existing ones as special cases. Second, to facilitate the use of the generalized RCI, a novel Lyapunov-Krasovskii functional (LKF) is constructed, which includes a generalized delay-product term. Third, based on the generalized RCI and the novel LKF, several stability criteria for the delayed NNs under study are put forward. Finally, two numerical examples are given to illustrate the effectiveness and advantages of the proposed stability criteria.
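For orientation only, and not quoted from this record: the classical reciprocally convex inequality, which the generalized RCI described above is stated to subsume as a special case, is commonly written as follows in the literature. For positive definite matrices R_1, R_2, a scalar alpha in (0,1), and any slack matrix S satisfying the matrix condition on the right, and for all vectors x_1, x_2,

\[
\frac{1}{\alpha}\, x_1^{\top} R_1 x_1 + \frac{1}{1-\alpha}\, x_2^{\top} R_2 x_2
\;\ge\;
\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}^{\top}
\begin{bmatrix} R_1 & S \\ S^{\top} & R_2 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \end{bmatrix},
\qquad
\text{provided }
\begin{bmatrix} R_1 & S \\ S^{\top} & R_2 \end{bmatrix} \succeq 0 .
\]

The slack matrix S is the standard device for replacing the delay-dependent 1/alpha and 1/(1-alpha) weights with a single delay-independent quadratic bound; per the abstract, the generalized RCI provides a tighter bound of this type.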
ISSN: 2162-237X
2162-2388
DOI: 10.1109/TNNLS.2022.3144032