Global Finite-time Stability for Fractional-order Neural Networks

Bibliographic Details
Published in: Optical Memory & Neural Networks, Vol. 29, No. 2, pp. 77–99
Main Author: Xiaolong Hu
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing, 01.04.2020
Summary: This paper is concerned with the global Mittag-Leffler stability (GMLS) and global finite-time stability (GFTS) of fractional-order Hopfield neural networks (FHNNs) with Hölder-continuous neuron activation functions subject to nonlinear growth. First, four convex functions are proposed which guarantee that four formulas involving the fractional derivative hold. Correspondingly, a novel finite-time convergence principle for FHNNs is developed on the basis of these formulas. In addition, the existence and uniqueness of the equilibrium point are proved by applying Brouwer topological degree theory and inequality analysis techniques. Subsequently, by means of the Lur'e-type Postnikov Lyapunov functional approach and the presented finite-time convergence principle, GMLS and GFTS conditions are obtained in terms of linear matrix inequalities (LMIs). Moreover, an accurate upper bound on the settling time for the GFTS is calculated. Finally, three numerical examples are given to verify the validity of the theoretical results.
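For context on the GMLS notion named in the summary: trajectories of a Mittag-Leffler-stable system are bounded by a decay envelope built from the one-parameter Mittag-Leffler function E_α(z) = Σ_{k≥0} z^k / Γ(αk + 1), which generalizes the exponential (E_1(z) = e^z). The following is a minimal sketch, not taken from the paper, of evaluating E_α by series truncation; the function name and truncation depth are illustrative assumptions.

```python
import math

def mittag_leffler(alpha, z, terms=100):
    """Truncated series for the one-parameter Mittag-Leffler function
    E_alpha(z) = sum_{k>=0} z**k / Gamma(alpha*k + 1).

    Illustrative sketch only; terms=100 is an arbitrary truncation that
    is adequate for moderate |z|. lgamma is used instead of gamma so
    that large arguments underflow the term to 0 rather than overflow.
    """
    total = 0.0
    for k in range(terms):
        # 1/Gamma(alpha*k + 1) computed as exp(-lgamma(...)); the
        # argument is >= 1, so Gamma is positive and this is exact.
        total += z**k * math.exp(-math.lgamma(alpha * k + 1.0))
    return total
```

Two sanity checks follow from known special cases: E_1(1) = e and E_2(1) = cosh(1). For a GMLS system with order α in (0, 1), the envelope E_α(−λt^α) decays algebraically rather than exponentially, which is why Mittag-Leffler stability is the natural fractional-order analogue of exponential stability.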
ISSN: 1060-992X; 1934-7898
DOI: 10.3103/S1060992X20020046