Multi-step-ahead prediction using dynamic recurrent neural networks

Bibliographic Details
Published in: Neural Networks, Vol. 13, No. 7, pp. 765-786
Main Authors: Parlos, A.G., Rais, O.T., Atiya, A.F.
Format: Journal Article
Language: English
Published: Oxford: Elsevier Ltd, 01.09.2000

Summary: A method for the development of empirical predictive models for complex processes is presented. The models are capable of performing accurate multi-step-ahead (MS) predictions, while maintaining acceptable single-step-ahead (SS) prediction accuracy. Such predictors find applications in model predictive controllers and in fault diagnosis systems. The proposed method makes use of dynamic recurrent neural networks in the form of a nonlinear infinite impulse response (IIR) filter. A learning algorithm is presented, which is based on a dynamic gradient descent approach. The effectiveness of the method for accurate MS prediction is tested on an artificial problem and on a complex, open-loop unstable process. Comparative results are presented with polynomial Nonlinear AutoRegressive with eXogenous (NARX) predictors, and with recurrent networks trained using teacher forcing. Validation studies indicate that excellent generalization is obtained for the range of operational dynamics studied. The research demonstrates that the proposed network architecture and the associated learning algorithm are quite effective in modeling the dynamics of complex processes and performing accurate MS predictions.
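
As a rough illustration of the SS versus MS distinction described in the summary, the sketch below (not the authors' architecture or learning algorithm; the model orders, hidden-layer size, and random weights are placeholder assumptions) contrasts single-step-ahead prediction, where the delayed outputs fed to the predictor are measured values, with multi-step-ahead prediction, where the predictor feeds its own past outputs back through the delay line, in the spirit of a nonlinear IIR filter.

```python
# Minimal sketch of SS vs. MS prediction with an IIR-style predictor:
# y_hat[k] = f(past outputs, past inputs). In SS mode the past outputs are
# measurements; in MS mode they are the predictor's own earlier predictions.
# Weights are random placeholders standing in for a trained network.
import numpy as np

rng = np.random.default_rng(0)

N_LAG_Y, N_LAG_U, N_HIDDEN = 2, 2, 8            # illustrative model orders
W1 = rng.normal(scale=0.3, size=(N_HIDDEN, N_LAG_Y + N_LAG_U))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.3, size=N_HIDDEN)

def predict_one_step(y_lags, u_lags):
    """One pass through the nonlinear predictor: y_hat = f(y lags, u lags)."""
    x = np.concatenate([y_lags, u_lags])
    return W2 @ np.tanh(W1 @ x + b1)

def ss_prediction(y_meas, u):
    """Single-step-ahead: delayed outputs come from measurements (open loop)."""
    preds = []
    for k in range(max(N_LAG_Y, N_LAG_U), len(y_meas)):
        y_lags = y_meas[k - N_LAG_Y:k][::-1]
        u_lags = u[k - N_LAG_U:k][::-1]
        preds.append(predict_one_step(y_lags, u_lags))
    return np.array(preds)

def ms_prediction(y_init, u, horizon):
    """Multi-step-ahead: the predictor's own outputs are fed back (closed loop)."""
    y_hist = list(y_init[-N_LAG_Y:])
    preds = []
    for k in range(horizon):
        y_lags = np.array(y_hist[-N_LAG_Y:])[::-1]
        u_lags = u[k:k + N_LAG_U][::-1]
        y_hat = predict_one_step(y_lags, u_lags)
        preds.append(y_hat)
        y_hist.append(y_hat)          # feed the prediction back as a lagged output
    return np.array(preds)

# Toy usage on a synthetic signal.
u = rng.normal(size=50)
y_meas = np.sin(0.3 * np.arange(50))
print(ss_prediction(y_meas, u)[:5])
print(ms_prediction(y_meas[:N_LAG_Y], u, horizon=5))
```

With a trained network, the MS rollout propagates prediction errors through the feedback path, which is why the paper argues for training with a dynamic gradient that accounts for this recurrence rather than relying on teacher forcing alone.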
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/S0893-6080(00)00048-4