Partial observations of continuous time-series dynamics at arbitrary time stamps exist in many disciplines. Fitting this type of data using statistical models with continuous dynamics is not only promising at an intuitive level but also has practical benefits, including the ability to generate continuous trajectories and to perform inference on previously unseen time stamps. Despite exciting progress in this area, the existing models still face challenges in terms of their representational power and the quality of their variational approximations. We tackle these challenges with continuous latent process flows (CLPF), a principled architecture decoding continuous latent processes into continuous observable processes using a time-dependent normalizing flow driven by a stochastic differential equation. To optimize our model using maximum likelihood, we propose a novel piecewise construction of a variational posterior process and derive the corresponding variational lower bound using trajectory re-weighting. Our ablation studies demonstrate the effectiveness of our contributions in various inference tasks on irregular time grids. Comparisons to state-of-the-art baselines show our model’s favourable performance on both synthetic and real-world time-series data.
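To make the generative direction concrete, below is a minimal numerical sketch of the idea described above: a latent process is simulated in continuous time (here via Euler–Maruyama on a simple SDE) and each latent state is decoded into observation space by a time-dependent invertible map. The specific drift, diffusion, and the affine decoder `decode` are illustrative placeholders, not the paper's learned components.

```python
import numpy as np

def simulate_latent_sde(z0, drift, diffusion, ts, rng):
    """Euler-Maruyama simulation of dZ_t = mu(Z_t, t) dt + sigma(Z_t, t) dW_t."""
    zs = [np.asarray(z0, dtype=float)]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dt = t1 - t0
        z = zs[-1]
        dw = rng.normal(scale=np.sqrt(dt), size=z.shape)  # Brownian increment
        zs.append(z + drift(z, t0) * dt + diffusion(z, t0) * dw)
    return np.stack(zs)

def decode(z, t):
    """Hypothetical time-dependent affine flow standing in for the learned
    normalizing flow: x = exp(s(t)) * z + b(t), invertible for every t."""
    return np.exp(0.1 * np.sin(t)) * z + 0.5 * t

rng = np.random.default_rng(0)
# Irregular time stamps are supported: only the increments t1 - t0 matter.
ts = np.linspace(0.0, 1.0, 11)
zs = simulate_latent_sde(
    np.zeros(2),
    drift=lambda z, t: -z,        # toy mean-reverting (OU-like) drift
    diffusion=lambda z, t: 0.5,   # toy constant diffusion
    ts=ts, rng=rng,
)
xs = np.stack([decode(z, t) for z, t in zip(zs, ts)])
print(xs.shape)  # one decoded observation per time stamp
```

Because the decoder is invertible at every time stamp, the observable process inherits a well-defined continuous-time density from the latent SDE, which is what enables inference on previously unseen time stamps.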
Bibtex
@inproceedings{deng21,
title={{Continuous Latent Process Flows}},
author={Ruizhi Deng and Marcus A. Brubaker and Greg Mori and Andreas M. Lehrmann},
booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
year={2021}
}
Related Research
- What Constitutes Good Contrastive Learning in Time-Series Forecasting?
  C. Zhang, Q. Yan, L. Meng, and T. Sylvain.
- RBC Borealis at International Conference on Learning Representations (ICLR): Machine Learning for a better financial future
  Learning And Generalization; Natural Language Processing; Time series Modelling
- Self-Supervised Time Series Representation Learning with Temporal-Instance Similarity Distillation
  A. Hajimoradlou, L. Pishdad, F. Tung, and M. Karpusha. Workshop at International Conference on Machine Learning (ICML)