Reservoir computing-based advance warning of extreme events

| Published in | Chaos, Solitons and Fractals, Vol. 181; p. 114673 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier Ltd, 01.04.2024 |
| Subjects | |

Summary: Physics-based computing exploits nonlinear or disorder-induced complexity, for example, to realize energy-efficient and high-throughput computing tasks. A particularly difficult but useful task is the prediction of extreme events that can occur in a wide range of complex systems. We prepare an experiment based on a microcavity semiconductor laser that produces statistically rare extreme events resulting from the interplay of deterministic nonlinear dynamics and spontaneous emission noise. We then evaluate the performance of three reservoir computing training approaches in predicting the occurrence of extreme events. We show that Dual Training Reservoir Computing (which in turn can be implemented with fast semiconductor laser dynamics) can provide meaningful early warnings up to 15 times the typical linear correlation time of the dynamics.

Highlights:
• We experimentally generate and numerically reproduce Extreme Events (EEs).
• EEs are successfully predicted with three Reservoir Computing (RC) schemes.
• Single (SRC), Parallel (PRC), and Dual Training (DTRC) RC schemes are realized.
• The DTRC's accuracy remains above 50% beyond 30 EE autocorrelation times.
• The DTRC network shows a stable, peaked distribution of predicted EE arrival times.
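The abstract describes the generic reservoir computing recipe: a fixed random recurrent network is driven by the measured intensity, and only a linear readout is trained, here to flag an upcoming extreme event. The minimal echo-state-network sketch below illustrates that idea on synthetic data; it is not the paper's Single, Parallel, or Dual Training scheme, and the reservoir size, leak rate, warning horizon, and the "mean plus four standard deviations" event threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy intensity series with rare bursts (stand-in for the laser data) ---
T = 20000
x = np.sin(0.02 * np.arange(T)) + 0.3 * rng.standard_normal(T)
spikes = rng.random(T) < 0.002            # rare "extreme" bursts
x[spikes] += 6.0
threshold = np.mean(x) + 4 * np.std(x)    # illustrative EE criterion: mean + 4*sigma

horizon = 50                               # how far ahead we want the warning
# Target: 1 if an extreme event occurs within the next `horizon` steps
y = np.array([np.any(x[t + 1:t + 1 + horizon] > threshold)
              for t in range(T - horizon)], dtype=float)
x = x[:T - horizon]

# --- Standard leaky echo state network with fixed random weights ---
N = 300                                    # reservoir size (illustrative)
W_in = 0.5 * rng.uniform(-1, 1, size=N)
W = rng.uniform(-0.5, 0.5, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1
leak = 0.3

states = np.zeros((len(x), N))
r = np.zeros(N)
for t, u in enumerate(x):
    r = (1 - leak) * r + leak * np.tanh(W_in * u + W @ r)
    states[t] = r

# --- Ridge-regression readout: train on first half, test on second half ---
washout, split = 500, len(x) // 2
S_train, y_train = states[washout:split], y[washout:split]
ridge = 1e-4
W_out = np.linalg.solve(S_train.T @ S_train + ridge * np.eye(N),
                        S_train.T @ y_train)

pred = (states[split:] @ W_out) > 0.5      # warning flag on held-out data
hits = np.mean(pred[y[split:] == 1])       # fraction of pre-event steps flagged
print(f"early-warning hit rate on test data: {hits:.2f}")
```

Only the readout weights are learned (a single ridge regression), which is what keeps reservoir training cheap and makes hardware reservoirs, such as the fast semiconductor laser dynamics mentioned in the abstract, plausible substrates for the recurrent part.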
ISSN: 0960-0779; 1873-2887
DOI: 10.1016/j.chaos.2024.114673