This work presents a simple, inexpensive, theoretically motivated regularization term to enhance the robustness of deep time-index models for time-series forecasting. Recently, DeepTime demonstrated that this class of models can rival state-of-the-art deep historical-value models on the long-term time-series forecasting (LTSF) benchmarks. The DeepTime framework comprises two key components: (1) a time-indexed basis parameterized as an implicit neural representation (INR), and (2) a meta-learning formulation that fits observed data to this basis via ridge regression, then extrapolates the fitted map to generate forecasts. Our regularization term encourages the time-indexed basis elements to be closer to unit-standardized and mutually decorrelated, which is intended to make the ridge-regression fit more robust. The regularized variant matches or outperforms DeepTime on all LTSF benchmarks. Moreover, it is significantly more resilient to missing values in the lookback window at test time, improves forecast accuracy when applied to higher-frequency data than it was trained on, and boosts performance when trained on smaller datasets. Overall, we conclude that our regularized approach sets a new state-of-the-art for deep time-index models.
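The abstract does not give the exact form of the regularizer, but a natural reading of "unit-standardized and mutually decorrelated basis elements" is a penalty that pushes the empirical covariance of the basis features toward the identity. The sketch below illustrates that idea alongside the ridge-regression fit-then-extrapolate step described above; the function names (`basis_regularizer`, `ridge_fit_and_forecast`), the covariance-to-identity penalty, and the regularization weight are illustrative assumptions, not the paper's implementation.

```python
import torch

def basis_regularizer(phi: torch.Tensor) -> torch.Tensor:
    """Penalize deviation of the basis features from zero-mean, unit-variance,
    mutually uncorrelated columns (an assumed form of the regularizer).

    phi: (T, d) matrix of INR basis features evaluated at T time indices.
    Returns the squared Frobenius distance between the empirical covariance
    of the basis columns and the identity matrix.
    """
    t, d = phi.shape
    phi_c = phi - phi.mean(dim=0, keepdim=True)      # center each basis column
    cov = phi_c.T @ phi_c / (t - 1)                  # (d, d) empirical covariance
    eye = torch.eye(d, device=phi.device, dtype=phi.dtype)
    return ((cov - eye) ** 2).sum()

def ridge_fit_and_forecast(phi_lookback: torch.Tensor,
                           y_lookback: torch.Tensor,
                           phi_horizon: torch.Tensor,
                           lam: float = 1.0) -> torch.Tensor:
    """Closed-form ridge regression of the lookback window onto the basis,
    then extrapolation of the fitted map over the forecast horizon.

    phi_lookback: (L, d) basis features at lookback time indices
    y_lookback:   (L, m) observed values over the lookback window
    phi_horizon:  (H, d) basis features at future (forecast) time indices
    """
    d = phi_lookback.shape[1]
    eye = torch.eye(d, device=phi_lookback.device, dtype=phi_lookback.dtype)
    gram = phi_lookback.T @ phi_lookback + lam * eye
    w = torch.linalg.solve(gram, phi_lookback.T @ y_lookback)   # (d, m) ridge weights
    return phi_horizon @ w                                      # (H, m) forecast
```

In such a setup, `basis_regularizer` would be added with a tunable weight to the forecasting loss while training the INR, so that the learned basis stays well-conditioned for the ridge-regression step at test time.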

BibTeX

@article{sastry2025deeprrtime,
  title={Deep{RRT}ime: Robust Time-series Forecasting with a Regularized {INR} Basis},
  author={Chandramouli Shama Sastry and Mahdi Gilany and Kry Yik-Chau Lui and Martin Magill and Alexander Pashevich},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2025},
  url={https://openreview.net/forum?id=uDRzORdPT7},
  note={}
}
