We propose a self-supervised method for pretraining universal time series representations, in which we learn contrastive representations using similarity distillation along the temporal and instance dimensions. We analyze the effectiveness of both dimensions and evaluate our pretrained representations on three downstream tasks: time series classification, anomaly detection, and forecasting.
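To make the idea concrete, the sketch below illustrates one way similarity distillation along the temporal and instance dimensions can be set up: a student's pairwise-similarity distributions (across series at each timestep, and across timesteps within each series) are matched to a teacher's. This is a minimal illustration under assumed embedding shapes and temperatures, not the authors' implementation; all names and defaults here (similarity_distillation_loss, tau_s, tau_t) are hypothetical.

# Hedged sketch (not the paper's code): similarity distillation along the
# temporal and instance dimensions, assuming a student and a teacher encoder
# that each map a window to embeddings of shape (batch, time, dim).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def similarity_distillation_loss(student, teacher, tau_s=0.1, tau_t=0.05):
    """student, teacher: (batch, time, dim) embeddings.

    Builds cosine-similarity distributions along the instance axis (which
    series resemble each other at a given timestep) and along the temporal
    axis (which timesteps resemble each other within a series), then
    distills the teacher's distributions into the student via cross-entropy.
    Temperatures tau_s and tau_t are illustrative defaults.
    """
    def normalize(z):
        return z / (np.linalg.norm(z, axis=-1, keepdims=True) + 1e-8)

    s, t = normalize(student), normalize(teacher)

    # Instance dimension: batch-by-batch similarities at each timestep.
    sim_s_inst = np.einsum('btd,ctd->tbc', s, s)   # (time, batch, batch)
    sim_t_inst = np.einsum('btd,ctd->tbc', t, t)

    # Temporal dimension: timestep-by-timestep similarities within each series.
    sim_s_temp = np.einsum('btd,bud->btu', s, s)   # (batch, time, time)
    sim_t_temp = np.einsum('btd,bud->btu', t, t)

    def distill(sim_student, sim_teacher):
        p_teacher = softmax(sim_teacher / tau_t)
        log_p_student = np.log(softmax(sim_student / tau_s) + 1e-8)
        return -(p_teacher * log_p_student).sum(-1).mean()

    return distill(sim_s_inst, sim_t_inst) + distill(sim_s_temp, sim_t_temp)

# Toy usage: random embeddings standing in for encoder outputs.
rng = np.random.default_rng(0)
student = rng.normal(size=(8, 32, 64))
teacher = rng.normal(size=(8, 32, 64))
print(similarity_distillation_loss(student, teacher))

In a typical self-distillation setup the teacher embeddings would come from a slowly updated (e.g. momentum-averaged) copy of the student encoder, with gradients flowing only through the student branch; the specifics in the paper may differ.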
BibTeX
@inproceedings{hajimoradlou2022selfsupervised,
  title={Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation},
  author={Ainaz Hajimoradlou and Leila Pishdad and Frederick Tung and Maryna Karpusha},
  booktitle={First Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward at ICML 2022},
  year={2022},
  url={https://openreview.net/forum?id=nhtkdCvVLIh}
}
Related Research
- What Constitutes Good Contrastive Learning in Time-Series Forecasting?
  C. Zhang, Q. Yan, L. Meng, and T. Sylvain.
- Constant Memory Attention Block
  L. Feng, F. Tung, H. Hajimirsadeghi, Y. Bengio, and M. O. Ahmed. Workshop at International Conference on Machine Learning (ICML).
- RBC Borealis at International Conference on Learning Representations (ICLR): Machine Learning for a better financial future
  Learning and Generalization; Natural Language Processing; Time Series Modelling.