A Supervised Contrastive Learning Pretrain-Finetune Approach for Time Series
Main Authors | , , , , |
---|---|
Format | Journal Article |
Language | English |
Published | 20.11.2023 |
Summary: | Foundation models have recently gained attention within the field of machine learning thanks to their efficiency in broad data processing. While researchers have attempted to extend this success to time series models, the main challenge lies in effectively extracting representations and transferring knowledge from pretraining datasets to the target fine-tuning dataset. To tackle this issue, we introduce a novel pretraining procedure that leverages supervised contrastive learning to distinguish features within each pretraining dataset. This pretraining phase enables a probabilistic similarity metric, which assesses the likelihood of a univariate sample being closely related to one of the pretraining datasets. Subsequently, using this similarity metric as a guide, we propose a fine-tuning procedure designed to enhance accurate prediction of the target data by aligning it more closely with the learned dynamics of the pretraining datasets. Our experiments show promising results that demonstrate the efficacy of our approach. |
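The core idea described in the summary can be illustrated concretely: supervised contrastive pretraining in which each window's label is the identity of the pretraining dataset it came from, followed by a probabilistic similarity score over the learned datasets. The sketch below is a minimal illustration under stated assumptions, not the authors' released code: the `TSEncoder` architecture, window length, temperature, and the prototype-based reading of the "probabilistic similarity metric" are all placeholders chosen for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TSEncoder(nn.Module):
    """Toy 1-D conv encoder mapping a univariate window to a unit-norm embedding."""
    def __init__(self, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, emb_dim),
        )

    def forward(self, x):             # x: (batch, window_len)
        z = self.net(x.unsqueeze(1))  # add a channel dimension for Conv1d
        return F.normalize(z, dim=-1)

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss (Khosla et al., 2020) with the source
    dataset ID as the label: windows from the same pretraining dataset
    are pulled together, windows from different datasets pushed apart."""
    sim = z @ z.T / temperature                 # pairwise cosine similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)      # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    n_pos = pos.sum(1)
    loss = -(log_prob * pos).sum(1) / n_pos.clamp(min=1)
    return loss[n_pos > 0].mean()               # anchors with >=1 positive

def dataset_similarity(z, prototypes, temperature=0.1):
    """One plausible reading of the 'probabilistic similarity metric':
    softmax over a sample's similarity to each pretraining dataset's
    mean embedding (prototype)."""
    return F.softmax(z @ prototypes.T / temperature, dim=-1)

# Illustrative pretraining step on fake data: 32 univariate windows of
# length 128 drawn from 4 hypothetical pretraining datasets.
enc = TSEncoder()
x = torch.randn(32, 128)
y = torch.arange(32) % 4            # which pretraining dataset each window is from
loss = supcon_loss(enc(x), y)
loss.backward()

# After pretraining, per-dataset prototypes yield membership probabilities
# that could guide fine-tuning on a target series, as the summary describes.
with torch.no_grad():
    z = enc(x)
    protos = torch.stack([z[y == k].mean(0) for k in range(4)])
    probs = dataset_similarity(z, protos)   # (32, 4) probabilities
```

How these probabilities steer the fine-tuning procedure itself (e.g., as sample weights or as a mixture over dataset-specific dynamics) is not specified in this record, so the sketch stops at the metric.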
DOI: | 10.48550/arxiv.2311.12290 |