Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion

Marin Biloš, Kashif Rasul, Anderson Schneider, Yuriy Nevmyvaka, Stephan Günnemann
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:2452-2470, 2023.

Abstract

Temporal data such as time series can be viewed as discretized measurements of the underlying function. To build a generative model for such data we have to model the stochastic process that governs it. We propose a solution by defining the denoising diffusion model in the function space which also allows us to naturally handle irregularly-sampled observations. The forward process gradually adds noise to functions, preserving their continuity, while the learned reverse process removes the noise and returns functions as new samples. To this end, we define suitable noise sources and introduce novel denoising and score-matching models. We show how our method can be used for multivariate probabilistic forecasting and imputation, and how our model can be interpreted as a neural process.
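To make the core idea concrete: the forward process replaces i.i.d. Gaussian noise with a noise source that is itself continuous in time, so the noised samples remain functions rather than unstructured point clouds. Below is a minimal sketch of one such forward step, assuming an RBF Gaussian-process noise source and a standard DDPM-style variance schedule; the function names, kernel choice, and schedule value are illustrative assumptions, not taken from the authors' code.

```python
import numpy as np

def gp_noise(t, scale=1.0, length_scale=0.1, rng=None):
    """Sample a noise function at times t from a Gaussian process (RBF kernel).

    Nearby time points receive correlated noise values, so the noised
    function stays continuous in time (assumed noise source, for illustration).
    """
    rng = np.random.default_rng() if rng is None else rng
    diff = t[:, None] - t[None, :]
    cov = scale**2 * np.exp(-0.5 * (diff / length_scale) ** 2)
    cov += 1e-6 * np.eye(len(t))  # jitter for numerical stability
    return rng.multivariate_normal(np.zeros(len(t)), cov)

def forward_diffuse(x, t, alpha_bar, rng=None):
    """One DDPM-style forward draw: sqrt(alpha_bar)*x + sqrt(1-alpha_bar)*eps,
    where eps is a correlated GP sample instead of i.i.d. Gaussian noise."""
    eps = gp_noise(t, rng=rng)
    return np.sqrt(alpha_bar) * x + np.sqrt(1.0 - alpha_bar) * eps, eps

# Usage: irregularly sampled observations of an underlying function.
t = np.sort(np.random.rand(50))   # irregular time grid
x = np.sin(2 * np.pi * t)         # clean function values at those times
x_noisy, eps = forward_diffuse(x, t, alpha_bar=0.5)
```

Because the noise is drawn per time point from a joint GP, this handles irregular sampling locations directly; the learned reverse model would then be trained to remove such correlated noise, but that part is not sketched here.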

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-bilos23a,
  title     = {Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion},
  author    = {Bilo\v{s}, Marin and Rasul, Kashif and Schneider, Anderson and Nevmyvaka, Yuriy and G\"{u}nnemann, Stephan},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {2452--2470},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/bilos23a/bilos23a.pdf},
  url       = {https://proceedings.mlr.press/v202/bilos23a.html},
  abstract  = {Temporal data such as time series can be viewed as discretized measurements of the underlying function. To build a generative model for such data we have to model the stochastic process that governs it. We propose a solution by defining the denoising diffusion model in the function space which also allows us to naturally handle irregularly-sampled observations. The forward process gradually adds noise to functions, preserving their continuity, while the learned reverse process removes the noise and returns functions as new samples. To this end, we define suitable noise sources and introduce novel denoising and score-matching models. We show how our method can be used for multivariate probabilistic forecasting and imputation, and how our model can be interpreted as a neural process.}
}
Endnote
%0 Conference Paper
%T Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion
%A Marin Biloš
%A Kashif Rasul
%A Anderson Schneider
%A Yuriy Nevmyvaka
%A Stephan Günnemann
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-bilos23a
%I PMLR
%P 2452--2470
%U https://proceedings.mlr.press/v202/bilos23a.html
%V 202
%X Temporal data such as time series can be viewed as discretized measurements of the underlying function. To build a generative model for such data we have to model the stochastic process that governs it. We propose a solution by defining the denoising diffusion model in the function space which also allows us to naturally handle irregularly-sampled observations. The forward process gradually adds noise to functions, preserving their continuity, while the learned reverse process removes the noise and returns functions as new samples. To this end, we define suitable noise sources and introduce novel denoising and score-matching models. We show how our method can be used for multivariate probabilistic forecasting and imputation, and how our model can be interpreted as a neural process.
APA
Biloš, M., Rasul, K., Schneider, A., Nevmyvaka, Y. & Günnemann, S. (2023). Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:2452-2470. Available from https://proceedings.mlr.press/v202/bilos23a.html.