LTScoder: Long-Term Time Series Forecasting Based on a Linear Autoencoder Architecture

Bibliographic Details
Published in: IEEE Access, Vol. 12, pp. 98623–98633
Main Authors: Kim, Geunyong; Yoo, Hark; Kim, Chorwon; Kim, Ryangsoo; Kim, Sungchang
Format: Journal Article
Language: English
Published: IEEE, 2024
Summary: A long-term time series forecasting (LTSF) model named LTScoder, based on a linear autoencoder architecture, is presented in this paper. LTScoder performs feature extraction through an encoder to generate a latent vector and conducts time series prediction from this latent vector through a decoder. The latent vector, which maps the input time series to a lower dimensionality, not only reduces the complexity of the linear regression performed in the decoder but also captures the temporal dependencies essential for forecasting. Experiments on popular univariate and multivariate datasets with diverse patterns demonstrate that LTScoder achieves up to 16.95 times faster inference than PatchTST, the state-of-the-art transformer-based model, with minimal degradation in prediction accuracy while using only 2.11% of PatchTST's parameters. Furthermore, we show that LTScoder is well suited for edge computing in time series prediction tasks. The architectural flexibility of LTScoder allows the encoder and decoder to be deployed on separate systems, enhancing scalability and efficiency. We measure the performance of LTScoder in terms of prediction accuracy and computational complexity on the ETRI edge gateway system.
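
The paper's own code is not part of this record. As a rough illustration of the linear encoder/decoder idea the summary describes, the sketch below builds a two-layer linear model in PyTorch: the encoder projects the input window to a low-dimensional latent vector, and the decoder linearly regresses the forecast horizon from that vector. The class name, tensor layout, and the dimensions (lookback=336, horizon=96, latent_dim=64) are illustrative assumptions, not values or an implementation taken from the paper.

import torch
import torch.nn as nn

class LinearAutoencoderForecaster(nn.Module):
    """Minimal sketch of a linear encoder/decoder forecaster.
    Hypothetical illustration; not the authors' LTScoder code."""

    def __init__(self, lookback: int, horizon: int, latent_dim: int):
        super().__init__()
        # Encoder: maps the input window to a lower-dimensional latent vector.
        self.encoder = nn.Linear(lookback, latent_dim)
        # Decoder: linear regression from the latent vector to the forecast.
        self.decoder = nn.Linear(latent_dim, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, lookback); nn.Linear acts on the last axis,
        # so each channel of a multivariate series is projected independently.
        z = self.encoder(x)      # (batch, channels, latent_dim)
        return self.decoder(z)   # (batch, channels, horizon)

# Usage: forecast 96 future steps from a 336-step window of a 7-variate series.
model = LinearAutoencoderForecaster(lookback=336, horizon=96, latent_dim=64)
window = torch.randn(8, 7, 336)
forecast = model(window)         # shape: (8, 7, 96)

Because the encoder and decoder are separate modules that communicate only through the small latent vector, this structure is also consistent with the split deployment the summary mentions: the encoder could run on one system and transmit the latent vector to a decoder hosted elsewhere.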
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3428479