Time Series Forecasting using Recurrent Neural Networks modified by Bayesian Inference in the Learning Process
Published in | 2019 IEEE Colombian Conference on Applications in Computational Intelligence (ColCACI), pp. 1 - 6
---|---
Main Authors | , , , , , , , ,
Format | Conference Proceeding
Language | English
Published | IEEE, 01.06.2019
Summary: | Typically, time series forecasting is done with models based directly on past observations from the same sequence. Such models are often trained as if unlimited noiseless data and unbounded computational resources were available. In practice, one must deal with finite, noisy datasets, which leads to uncertainty about how appropriate the model is. For this reason, models based on Bayesian inference are preferable. Probabilities are then treated as a representation of a rational agent's subjective uncertainty, and approximate inference is performed by maximizing a lower bound on the marginal likelihood. A modified algorithm using long short-term memory (LSTM) recurrent neural networks for time series forecasting is presented. The approach is designed to stay as close as possible to the original series in the sense of minimizing the associated Kullback-Leibler information criterion. A simulation study was conducted to evaluate and illustrate the results, comparing this approach with Bayesian neural-network-based algorithms on artificial chaotic time series.
---|---
DOI: | 10.1109/ColCACI.2019.8781984 |
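The central idea in the summary, performing approximate Bayesian inference by maximizing a lower bound on the marginal likelihood, can be illustrated on a toy model. The sketch below is not the paper's implementation (which uses LSTM networks); it is a minimal, self-contained example, with hypothetical function names, of the evidence lower bound (ELBO) for a one-parameter Gaussian model, where the bound splits into an expected log-likelihood term minus a KL divergence between the variational posterior and the prior:

```python
import numpy as np

def gaussian_kl(mu_q, sigma_q, mu_p=0.0, sigma_p=1.0):
    """Closed-form KL(q || p) between two univariate Gaussians."""
    return (np.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2.0 * sigma_p**2)
            - 0.5)

def elbo(data, mu_q, sigma_q, noise=1.0, n_samples=2000, seed=0):
    """Monte Carlo estimate of the evidence lower bound for a toy model
    y_i ~ N(w, noise^2) with variational posterior q(w) = N(mu_q, sigma_q^2):

        ELBO = E_q[log p(data | w)] - KL(q || prior)

    Maximizing this bound over (mu_q, sigma_q) is the variational
    counterpart of maximizing the (intractable) marginal likelihood.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(mu_q, sigma_q, size=n_samples)  # sample weights from q
    # log-likelihood of all observations for each sampled weight
    ll = (-0.5 * np.log(2 * np.pi * noise**2) * len(data)
          - ((data[None, :] - w[:, None]) ** 2).sum(axis=1) / (2 * noise**2))
    return ll.mean() - gaussian_kl(mu_q, sigma_q)
```

As a sanity check, a variational posterior centered near the data mean attains a higher ELBO than one centered far away, which is exactly the signal a gradient-based learner (such as the Bayesian LSTM described in the summary) would follow.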