Event sequences can be modeled by temporal point processes (TPPs) to capture their asynchronous and probabilistic nature. We propose an intensity-free framework that directly models the point process distribution using normalizing flows. This approach can capture highly complex temporal distributions without relying on restrictive parametric forms. Comparisons with state-of-the-art baseline models on both synthetic and challenging real-life datasets show that the proposed framework is effective at modeling the stochasticity of discrete event sequences.
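The sketch below illustrates the general idea of an intensity-free TPP: model the distribution of inter-event times directly with a conditional normalizing flow and train by maximum likelihood. It is a minimal, hypothetical example rather than the paper's implementation; the single affine transform over log inter-arrival times, the GRU history encoder, and all names and sizes (FlowTPP, hidden_size, and so on) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code): an intensity-free TPP that
# models inter-event times with a conditional normalizing flow.
import math
import torch
import torch.nn as nn

class FlowTPP(nn.Module):
    """Conditional affine flow over log inter-event times, conditioned on an RNN history."""

    def __init__(self, hidden_size=32):
        super().__init__()
        # GRU summarizes the event history; its state conditions the flow parameters.
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        # Maps the history embedding to the flow's shift and log-scale.
        self.to_params = nn.Linear(hidden_size, 2)

    def log_likelihood(self, inter_times):
        # inter_times: (batch, seq_len) tensor of positive inter-event gaps.
        x = torch.log(inter_times).unsqueeze(-1)      # work in log-time
        h, _ = self.rnn(x)
        # Condition each event on the history strictly before it.
        h = torch.cat([torch.zeros_like(h[:, :1]), h[:, :-1]], dim=1)
        shift, log_scale = self.to_params(h).chunk(2, dim=-1)
        # Affine flow to a standard-normal base: z = (log tau - shift) / exp(log_scale).
        z = (x - shift) * torch.exp(-log_scale)
        base_logprob = -0.5 * z ** 2 - 0.5 * math.log(2 * math.pi)
        # Change of variables: -log_scale for the affine map, -log tau for the log map.
        return (base_logprob - log_scale - x).sum(dim=(1, 2))

if __name__ == "__main__":
    model = FlowTPP()
    taus = torch.rand(4, 10) + 0.1                    # toy positive inter-event gaps
    nll = -model.log_likelihood(taus).mean()          # train by maximizing log-likelihood
    nll.backward()
    print("negative log-likelihood:", float(nll))
```

Because the likelihood is evaluated through the flow rather than an intensity function, no parametric form (e.g., exponential or Hawkes-style intensity) is imposed on the inter-event time distribution.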
BibTeX
@misc{mehrasa2019point,
  title={Point Process Flows},
  author={Nazanin Mehrasa and Ruizhi Deng and Mohamed Osama Ahmed and Bo Chang and Jiawei He and Thibaut Durand and Marcus Brubaker and Greg Mori},
  year={2019},
  eprint={1910.08281},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
Related Research
- What Constitutes Good Contrastive Learning in Time-Series Forecasting? C. Zhang, Q. Yan, L. Meng, and T. Sylvain. (Research)
- RBC Borealis at International Conference on Learning Representations (ICLR): Machine Learning for a better financial future. Learning And Generalization; Natural Language Processing; Time Series Modelling. (Research)
- Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation. A. Hajimoradlou, L. Pishdad, F. Tung, and M. Karpusha. Workshop at International Conference on Machine Learning (ICML). (Publications)