Robformer: A robust decomposition transformer for long-term time series forecasting

Bibliographic Details
Published in: Pattern Recognition, Vol. 153, p. 110552
Main Authors: Yu, Yang; Ma, Ruizhe; Ma, Zongmin
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.09.2024

Summary: Transformer-based forecasting methods have been widely applied to long-term multivariate time series forecasting, achieving significant improvements in extending the forecasting horizon. However, their performance can degrade severely when abrupt trend shifts and seasonal fluctuations arise in long-term time series. We therefore identify two bottlenecks of previous Transformer architectures: (1) the non-robust decomposition module and (2) the trend-shifting problem. These cause a distribution mismatch between the trend prediction and the ground truth in long-term multivariate series forecasting. To address these bottlenecks, we design Robformer, a novel decomposition-based Transformer consisting of three new inner modules that enhance the predictability of Transformers. Concretely, we renew the decomposition module and add a seasonal component adjustment module to handle non-stationary series. Further, we propose a novel inner trend forecasting architecture inspired by the polynomial fitting method, which outperforms previous designs in accuracy and robustness. Our empirical studies show that Robformer achieves 17% and 10% relative improvements over the state-of-the-art Autoformer and FEDformer baselines under the fair long-term multivariate setting on six benchmarks, covering five mainstream time series forecasting applications: energy, economics, traffic, weather, and disease. The code will be released upon publication.
•Transformer models are effective for long-term multivariate time series forecasting.
•Bottlenecks of decomposition Transformers for time series forecasting are identified.
•A robust Transformer to tackle the distribution shift of time series is proposed.
•Robformer achieves 10% to 17% relative improvements over state-of-the-art models.
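Since the Robformer code has not yet been released, the following is only a minimal illustrative sketch of the two general ideas the abstract refers to: splitting a series into trend and seasonal components via a moving average, and extrapolating the trend with a low-order polynomial fit. All function names and parameters (window, degree, horizon) are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only -- not the authors' released code.
import numpy as np

def moving_average_decompose(x: np.ndarray, window: int = 25):
    """Split a 1-D series into a smooth trend (moving average) and a
    seasonal/residual part, in the spirit of decomposition-based Transformers."""
    pad = window // 2
    # Replicate-pad the edges so the trend has the same length as the input.
    padded = np.concatenate([np.repeat(x[0], pad), x, np.repeat(x[-1], pad)])
    kernel = np.ones(window) / window
    trend = np.convolve(padded, kernel, mode="valid")[: len(x)]
    seasonal = x - trend
    return trend, seasonal

def polynomial_trend_forecast(trend: np.ndarray, horizon: int, degree: int = 2):
    """Fit a low-order polynomial to the extracted trend and extrapolate it,
    loosely mirroring the polynomial-fitting idea described in the abstract."""
    t = np.arange(len(trend))
    coeffs = np.polyfit(t, trend, deg=degree)
    future_t = np.arange(len(trend), len(trend) + horizon)
    return np.polyval(coeffs, future_t)

if __name__ == "__main__":
    # Synthetic example: linear trend + daily-like seasonality + noise.
    rng = np.random.default_rng(0)
    t = np.arange(400)
    series = 0.01 * t + np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(400)
    trend, seasonal = moving_average_decompose(series)
    print(polynomial_trend_forecast(trend, horizon=96)[:5])
```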
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2024.110552