Delay-induced self-organization dynamics in a prey-predator network with diffusion
Published in: Nonlinear Dynamics, Vol. 108, No. 4, pp. 4499-4510
Main Authors:
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands; Springer Nature B.V., 01.06.2022
Summary: Time delay is a common phenomenon in biological systems, and reaction-diffusion equations with delay are widely used to study the dynamic mechanisms of such systems, in which delay can induce loss of stability and degradation of performance. In this paper, taking into account the inhomogeneous spatial distribution of species, which can be modeled as a random network, the pattern dynamics of a prey-predator network system with diffusion and delay are investigated. The effects of delay and diffusion on the network system are obtained by linear stability analysis, covering stability, Hopf bifurcation, and Turing patterns. The results show that the stability of the system changes with the value of the delay. Moreover, the Turing patterns obtained depend on the network connection probability and the diffusion coefficients. Finally, numerical simulation verifies these results.
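The setup described in the abstract can be sketched numerically: species occupy the nodes of a random network, diffuse along its edges via the graph Laplacian, and interact through delayed local kinetics. The sketch below uses a generic delayed Lotka-Volterra prey-predator model on an Erdős-Rényi network as a stand-in; the specific equations, parameter names (`tau`, `du`, `dv`, `p`), and integration scheme are illustrative assumptions, not the model of the paper.

```python
import numpy as np

def erdos_renyi_laplacian(n, p, rng):
    # Random network with connection probability p; return its graph Laplacian.
    a = np.triu((rng.random((n, n)) < p).astype(float), 1)
    a = a + a.T                              # symmetric adjacency, no self-loops
    return np.diag(a.sum(axis=1)) - a

def simulate(n=20, p=0.2, tau=0.5, du=0.01, dv=0.5,
             dt=0.01, steps=2000, seed=0):
    """Euler integration of a delayed prey-predator system coupled
    through the network Laplacian (hypothetical kinetics, for
    illustration only)."""
    rng = np.random.default_rng(seed)
    L = erdos_renyi_laplacian(n, p, rng)
    lag = int(round(tau / dt))               # delay expressed in time steps
    # Small perturbation of the coexistence state (1, 1) at every node,
    # with a constant history over [-tau, 0].
    u = 1.0 + 0.01 * rng.standard_normal(n)  # prey density per node
    v = 1.0 + 0.01 * rng.standard_normal(n)  # predator density per node
    u_hist = [u.copy()] * (lag + 1)
    for _ in range(steps):
        u_lag = u_hist[0]                    # prey density tau time units ago
        fu = u * (1.0 - u_lag) - u * v       # delayed logistic growth, predation
        fv = v * (u - 1.0)                   # predator gain/loss (hypothetical)
        # Network diffusion enters as -D * L @ x (L = degree - adjacency).
        u = u + dt * (fu - du * (L @ u))
        v = v + dt * (fv - dv * (L @ v))
        u_hist.append(u.copy())
        u_hist.pop(0)                        # slide the delay buffer forward
    return u, v

u, v = simulate()
print(u.shape, v.shape)
```

A linear stability analysis of this kind of model expands perturbations in the eigenvectors of `L`, so the Laplacian spectrum (set by `n` and `p`) plays the role that wavenumbers play in continuous-space Turing analysis.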
ISSN: 0924-090X, 1573-269X
DOI: 10.1007/s11071-022-07431-5