A deep-learning prediction model for imbalanced time series data forecasting

Bibliographic Details
Published in: Big Data Mining and Analytics, Vol. 4, No. 4, pp. 266-278
Main Authors: Hou, Chenyu; Wu, Jiawei; Cao, Bin; Fan, Jing
Format: Journal Article
Language: English
Published: Tsinghua University Press, 01.12.2021
Summary: Time series forecasting has attracted wide attention in recent decades. However, some time series are imbalanced, showing different patterns between special and normal periods, which degrades prediction accuracy for the special periods. In this paper, we aim to develop a unified model that alleviates the imbalance and thus improves prediction accuracy for special periods. This task is challenging for two reasons: (1) the temporal dependency of the series, and (2) the tradeoff between mining similar patterns and distinguishing the different distributions of different periods. To tackle these issues, we propose a self-attention-based time-varying prediction model with a two-stage training strategy. First, we use an encoder-decoder module with multi-head self-attention to extract common patterns of the time series. Then, we propose a time-varying optimization module that optimizes the results for special periods and eliminates the imbalance. Moreover, we propose reverse distance attention in place of traditional dot-product attention to highlight the importance of similar historical values to the forecast. Finally, extensive experiments show that our model outperforms the baselines in terms of mean absolute error and mean absolute percentage error.
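The abstract's "reverse distance attention" can be sketched roughly as follows. Since the paper's exact formulation is not reproduced here, everything in this snippet is an illustrative assumption: the function name, the use of L1 distance between query and key vectors, and the inverse-distance weighting `1 / (distance + eps)`, which simply makes attention weights large where the historical values are close to the query (the stated intent of replacing dot-product similarity).

```python
import numpy as np

def reverse_distance_attention(q, k, v, eps=1e-6):
    """Hypothetical sketch of distance-based attention.

    q: (n_queries, d) query vectors
    k: (n_keys, d)    key vectors (historical values)
    v: (n_keys, d_v)  value vectors

    Instead of scaled dot-product scores, each key is weighted by the
    inverse of its L1 distance to the query, so the most similar
    historical values dominate the forecast.
    """
    # Pairwise L1 distances, shape (n_queries, n_keys)
    dist = np.abs(q[:, None, :] - k[None, :, :]).sum(-1)
    # Inverse-distance scores; eps avoids division by zero for exact matches
    scores = 1.0 / (dist + eps)
    # Normalize scores into attention weights per query
    weights = scores / scores.sum(-1, keepdims=True)
    return weights @ v
```

For example, a query identical to one key attends almost entirely to that key's value, while a query equidistant from two keys averages their values.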
ISSN: 2096-0654
eISSN: 2097-406X
DOI: 10.26599/BDMA.2021.9020011