Pretext training followed by task-specific fine-tuning has been a successful approach in vision and language domains. This paper proposes a self-supervised pretext training framework tailored to event sequence data. We introduce novel auxiliary tasks (pretext tasks) that encourage the network to learn the coupling relationships between event times and types — a previously untapped source of self-supervision without labels. These pretext tasks unlock foundational representations that are generalizable across different downstream tasks, including next-event prediction for temporal point process models, event sequence classification, and missing event interpolation. Experiments on popular public benchmarks demonstrate the potential of the proposed method across different tasks and data domains.
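As the title suggests, the self-supervision signal comes from detecting misalignment between event times and event types. The sketch below is a minimal illustration of one possible instantiation, assuming a binary pretext task in which negative examples are built by permuting the event types of a sequence so that the original time/type coupling is broken; the model, the negative-sampling scheme, and all names are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn

class MisalignmentDetector(nn.Module):
    """Toy sequence encoder with a binary head: is this time/type pairing aligned?"""
    def __init__(self, num_types, d_model=32):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, d_model)
        self.time_proj = nn.Linear(1, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, times, types):
        # times: (B, L) float timestamps, types: (B, L) integer event types
        x = self.type_emb(types) + self.time_proj(times.unsqueeze(-1))
        _, h = self.encoder(x)           # h: (1, B, d_model), final hidden state
        return self.head(h.squeeze(0))   # (B, 1) misalignment logit

def make_misaligned(times, types):
    # Negative sample: permute event types within each sequence so the
    # original time/type coupling is destroyed (an assumed construction).
    perm = torch.argsort(torch.rand(types.shape), dim=1)
    return times, torch.gather(types, 1, perm)

# Minimal training step on synthetic data.
B, L, num_types = 8, 20, 5
times = torch.sort(torch.rand(B, L), dim=1).values   # increasing event times
types = torch.randint(0, num_types, (B, L))

model = MisalignmentDetector(num_types)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()

neg_times, neg_types = make_misaligned(times, types)
logits = torch.cat([model(times, types), model(neg_times, neg_types)])
labels = torch.cat([torch.zeros(B, 1), torch.ones(B, 1)])  # 1 = misaligned
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

In such a setup, the encoder pretrained on the misalignment objective would then be reused as the backbone for downstream tasks such as next-event prediction, event sequence classification, or missing event interpolation.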
BibTeX
@inproceedings{wang2024selfsupervised,
  title={Self-Supervised Pretext Tasks for Event Sequence Data from Detecting Misalignment},
  author={Yimu Wang and He Zhao and Ruizhi Deng and Frederick Tung and Greg Mori},
  booktitle={NeurIPS 2024 Workshop: Self-Supervised Learning – Theory and Practice},
  year={2024},
  url={https://openreview.net/forum?id=zc101QzXtw}
}
Related Research
- Minimal LSTMs and GRUs: Simple, Efficient, and Fully Parallelizable. L. Feng, F. Tung, and H. Hajimirsadeghi.
- Infinite-Width Networks from Different Viewpoints: A Comprehensive Collection of Research Tutorials. S. Prince.
- Unsupervised Event Outlier Detection in Continuous Time. S. Nath, K. Y. C. Lui, and S. Liu. Workshop at Conference on Neural Information Processing Systems (NeurIPS).