Pretext training followed by task-specific fine-tuning has been a successful approach in vision and language domains. This paper proposes a self-supervised pretext training framework tailored to event sequence data. We introduce novel auxiliary tasks (pretext tasks) that encourage the network to learn the coupling relationships between event times and types — a previously untapped source of self-supervision without labels. These pretext tasks unlock foundational representations that are generalizable across different downstream tasks, including next-event prediction for temporal point process models, event sequence classification, and missing event interpolation. Experiments on popular public benchmarks demonstrate the potential of the proposed method across different tasks and data domains.
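To make the idea concrete, the following is a minimal, hypothetical sketch of a misalignment-detection pretext task, assuming negatives are built by shuffling event types relative to their timestamps and a sequence encoder is trained to tell aligned from misaligned sequences. The names (EventEncoder, make_misaligned, pretext_step) and the architecture are illustrative assumptions, not the paper's actual implementation.

# Hypothetical sketch: misalignment detection as a self-supervised pretext task
# for event sequences of (time, type) pairs. Negatives are made by permuting
# event types within a sequence, which breaks the time-type coupling while
# keeping the marginal type distribution.
import torch
import torch.nn as nn

class EventEncoder(nn.Module):
    """Embeds (time, type) event sequences and pools them into one vector."""
    def __init__(self, num_types: int, d_model: int = 64):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, d_model)
        self.time_proj = nn.Linear(1, d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)

    def forward(self, times: torch.Tensor, types: torch.Tensor) -> torch.Tensor:
        # times: (B, T) float timestamps; types: (B, T) integer event-type ids
        x = self.type_emb(types) + self.time_proj(times.unsqueeze(-1))
        _, h = self.rnn(x)            # h: (1, B, d_model)
        return h.squeeze(0)           # sequence-level representation

def make_misaligned(types: torch.Tensor) -> torch.Tensor:
    """Negative sample: randomly permute event types within each sequence."""
    perm = torch.argsort(torch.rand_like(types, dtype=torch.float), dim=1)
    return torch.gather(types, 1, perm)

def pretext_step(encoder, classifier, times, types, optimizer):
    """One self-supervised step: classify aligned vs. misaligned sequences."""
    neg_types = make_misaligned(types)
    reps = torch.cat([encoder(times, types), encoder(times, neg_types)], dim=0)
    labels = torch.cat([torch.ones(len(times)), torch.zeros(len(times))])
    loss = nn.functional.binary_cross_entropy_with_logits(
        classifier(reps).squeeze(-1), labels
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    encoder = EventEncoder(num_types=10)
    classifier = nn.Linear(64, 1)
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3
    )
    times = torch.rand(8, 20).cumsum(dim=1)     # toy increasing timestamps
    types = torch.randint(0, 10, (8, 20))       # toy event types
    print(pretext_step(encoder, classifier, times, types, opt))

Because the supervision comes entirely from the sequences themselves, no labels are needed; under this sketch the pretrained encoder could then be fine-tuned for downstream tasks such as next-event prediction or sequence classification.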
BibTeX
@inproceedings{wang2024selfsupervised,
  title={Self-Supervised Pretext Tasks for Event Sequence Data from Detecting Misalignment},
  author={Yimu Wang and He Zhao and Ruizhi Deng and Frederick Tung and Greg Mori},
  booktitle={NeurIPS 2024 Workshop: Self-Supervised Learning – Theory and Practice},
  year={2024},
  url={https://openreview.net/forum?id=zc101QzXtw}
}
Related Research
- Detecting Mule Account Fraud with Federated Learning
- Scalable Temporal Domain Generalization via Prompting. S. Hosseini, M. Zhai, H. Hajimirsadeghi, and F. Tung. Workshop at International Conference on Machine Learning (ICML)
- Accurate Parameter-Efficient Test-Time Adaptation for Time Series Forecasting. H. R. Medeiros, H. Sharifi, G. Oliveira, and S. Irandoust. Workshop at International Conference on Machine Learning (ICML)