A Supervised Contrastive Learning Pretrain-Finetune Approach for Time Series

Bibliographic Details
Main Authors: Tran, Trang H.; Nguyen, Lam M.; Yeo, Kyongmin; Nguyen, Nam; Vaculin, Roman
Format: Journal Article
Language: English
Published: 20.11.2023
Summary: Foundation models have recently gained attention within the field of machine learning thanks to their efficiency in broad data processing. While researchers have attempted to extend this success to time series models, the main challenge is effectively extracting representations and transferring knowledge from pretraining datasets to the target fine-tuning dataset. To tackle this issue, we introduce a novel pretraining procedure that leverages supervised contrastive learning to distinguish features within each pretraining dataset. This pretraining phase enables a probabilistic similarity metric, which assesses the likelihood of a univariate sample being closely related to one of the pretraining datasets. Subsequently, using this similarity metric as a guide, we propose a fine-tuning procedure designed to enhance the accurate prediction of the target data by aligning it more closely with the learned dynamics of the pretraining datasets. Our experiments show promising results that demonstrate the efficacy of our approach.
DOI: 10.48550/arxiv.2311.12290
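
The summary above names two components: a supervised contrastive pretraining objective that separates the pretraining datasets in embedding space, and a probabilistic similarity metric used to guide fine-tuning. The paper's exact formulation is not given in this record, so the following is a minimal sketch only: the standard supervised contrastive loss (Khosla et al., 2020) with dataset identity as the label, plus a hypothetical `dataset_similarity` helper that scores a sample against one mean ("prototype") embedding per pretraining dataset. The prototype construction, function names, and both temperature values are assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def supcon_loss(features: torch.Tensor, labels: torch.Tensor,
                temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss (Khosla et al., 2020).

    features: (N, D) embeddings; labels: (N,) dataset identities.
    Samples sharing a label are pulled together; all others are pushed apart.
    """
    z = F.normalize(features, dim=1)                        # unit-norm embeddings
    logits = z @ z.T / temperature                          # pairwise similarities
    self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    logits = logits.masked_fill(self_mask, float("-inf"))   # exclude self-pairs

    # Positives: other samples drawn from the same pretraining dataset.
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)           # guard lone samples
    return (-log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts).mean()

def dataset_similarity(sample_emb: torch.Tensor, prototypes: torch.Tensor,
                       temperature: float = 0.1) -> torch.Tensor:
    """Hypothetical probabilistic similarity: softmax over cosine similarity
    between a sample embedding and per-dataset prototype embeddings, read as
    P(sample is closest to pretraining dataset k). An assumed stand-in for the
    paper's metric, which this record does not specify.
    """
    sims = F.normalize(sample_emb, dim=-1) @ F.normalize(prototypes, dim=-1).T
    return F.softmax(sims / temperature, dim=-1)
```

Under these assumptions, a fine-tuning loop could use the resulting distribution over pretraining datasets to weight how strongly a univariate target sample is aligned with each dataset's learned dynamics, in the spirit of the similarity-guided procedure the summary describes.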