SFC: a time series decomposition attention network with continuous nature for time series analysis
Published in: Data Mining and Knowledge Discovery, Vol. 39, No. 5, p. 43
Main Authors:
Format: Journal Article
Language: English
Published: New York: Springer US, 01.09.2025 (Springer Nature B.V.)
ISSN: 1384-5810; 1573-756X
DOI: 10.1007/s10618-025-01121-7
Summary: Time series analysis has attracted considerable research interest because of its wide range of applications, but it remains a challenging task. In recent years, deep learning methods such as CNNs and RNNs have made significant advances in time series classification and forecasting. However, the continuous nature of time series data has often been overlooked. It is crucial to recognize that a time series evolves continuously in real time; therefore, models must effectively capture this seamless continuity. In this context, we propose SFC, an attention network with an encoder-decoder architecture that integrates the idea of time series decomposition into a deep learning approach to tackle this problem. The goal is to capture the ongoing evolution of the dynamical system underlying the time series data. SFC features two types of modules, each associated with a specific component of the time series decomposition; each component is modeled by a differential equation of a corresponding special form. The attention mechanisms within SFC capture the relationships between different data points in a time series, effectively treating them as periodic components, and drive the process of time series decomposition. The information within each differential equation is then decoded, and SFC, with its encoder-decoder-attention structure, is formulated as a task-dependent neural network for time series analysis tasks. Extensive experiments on a variety of real-world datasets against strong, competitive baselines demonstrate the outstanding performance of our model.
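The abstract describes the architecture only at a high level: a decomposition of the series into components, each modeled by its own module, with attention linking data points and an encoder-decoder structure producing the task output. The paper's concrete layer definitions are not part of this record, so the following is a minimal, illustrative PyTorch sketch of a decomposition-plus-attention forecaster in that spirit; the moving-average decomposition, all module names, and all sizes are assumptions, not the authors' SFC.

```python
# Illustrative sketch only -- NOT the authors' SFC implementation.
# Assumes a moving-average decomposition and a Transformer encoder as stand-ins;
# the paper's actual components (differential-equation modules, decoder design,
# hyperparameters) are not given in this record.
import torch
import torch.nn as nn


class MovingAvgDecomposition(nn.Module):
    """Split a series into a trend (moving average) and a remainder component."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        pad_front = (self.kernel_size - 1) // 2
        pad_back = self.kernel_size - 1 - pad_front
        front = x[:, :1, :].repeat(1, pad_front, 1)
        back = x[:, -1:, :].repeat(1, pad_back, 1)
        padded = torch.cat([front, x, back], dim=1)                # (B, L + k - 1, C)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)   # (B, L, C)
        remainder = x - trend
        return trend, remainder


class DecompAttentionForecaster(nn.Module):
    """Attention-based forecaster operating on decomposed components (sketch)."""

    def __init__(self, channels: int, input_len: int, horizon: int,
                 d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.decomp = MovingAvgDecomposition(kernel_size=25)
        self.embed = nn.Linear(channels, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
        # Simple heads mapping the input length to the forecast horizon.
        self.seasonal_head = nn.Linear(input_len, horizon)
        self.trend_head = nn.Linear(input_len, horizon)
        self.project = nn.Linear(d_model, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_len, channels)
        trend, seasonal = self.decomp(x)
        enc = self.encoder(self.embed(seasonal))                                 # (B, L, d_model)
        seasonal_out = self.seasonal_head(enc.transpose(1, 2)).transpose(1, 2)   # (B, H, d_model)
        seasonal_out = self.project(seasonal_out)                                # (B, H, C)
        trend_out = self.trend_head(trend.transpose(1, 2)).transpose(1, 2)       # (B, H, C)
        return seasonal_out + trend_out                                          # (B, H, C)


if __name__ == "__main__":
    model = DecompAttentionForecaster(channels=7, input_len=96, horizon=24)
    x = torch.randn(8, 96, 7)     # batch of 8 multivariate series
    print(model(x).shape)         # torch.Size([8, 24, 7])
```

The per-component heads mirror the abstract's idea of handling each decomposed component with its own module before recombining them; the specific choice of a linear trend head and a Transformer seasonal head is purely illustrative.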