Probabilistic Time Series Modeling with Decomposable Denoising Diffusion Model

Tijin Yan, Hengheng Gong, He Yongping, Yufeng Zhan, Yuanqing Xia
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:55759-55777, 2024.

Abstract

Probabilistic time series modeling based on generative models has attracted significant attention because of its wide applications and excellent performance. However, existing state-of-the-art models, which are based on stochastic differential equations, not only struggle to determine the drift and diffusion coefficients during the design process but also suffer from slow generation speed. To tackle this challenge, we first propose the decomposable denoising diffusion model ($\text{D}^3\text{M}$) and prove that it is a general framework unifying denoising diffusion models and continuous flow models. Based on the new framework, we propose several simple but efficient probability paths with high generation speed. Furthermore, we design a module that combines a special state space model with linear gated attention modules for sequence modeling. It preserves inductive bias and simultaneously models both local and global dependencies. Experimental results on 8 real-world datasets show that $\text{D}^3\text{M}$ reduces RMSE and CRPS by up to 4.6% and 4.3%, respectively, compared with state-of-the-art methods on imputation tasks, and achieves results comparable to the state of the art on forecasting tasks with only 10 steps.

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-yan24b,
  title     = {Probabilistic Time Series Modeling with Decomposable Denoising Diffusion Model},
  author    = {Yan, Tijin and Gong, Hengheng and Yongping, He and Zhan, Yufeng and Xia, Yuanqing},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {55759--55777},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/yan24b/yan24b.pdf},
  url       = {https://proceedings.mlr.press/v235/yan24b.html},
  abstract  = {Probabilistic time series modeling based on generative models has attracted lots of attention because of its wide applications and excellent performance. However, existing state-of-the-art models, based on stochastic differential equation, not only struggle to determine the drift and diffusion coefficients during the design process but also have slow generation speed. To tackle this challenge, we firstly propose decomposable denoising diffusion model ($\text{D}^3\text{M}$) and prove it is a general framework unifying denoising diffusion models and continuous flow models. Based on the new framework, we propose some simple but efficient probability paths with high generation speed. Furthermore, we design a module that combines a special state space model with linear gated attention modules for sequence modeling. It preserves inductive bias and simultaneously models both local and global dependencies. Experimental results on 8 real-world datasets show that $\text{D}^3\text{M}$ reduces RMSE and CRPS by up to 4.6% and 4.3% compared with state-of-the-arts on imputation tasks, and achieves comparable results with state-of-the-arts on forecasting tasks with only 10 steps.}
}
Endnote
%0 Conference Paper
%T Probabilistic Time Series Modeling with Decomposable Denoising Diffusion Model
%A Tijin Yan
%A Hengheng Gong
%A He Yongping
%A Yufeng Zhan
%A Yuanqing Xia
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-yan24b
%I PMLR
%P 55759--55777
%U https://proceedings.mlr.press/v235/yan24b.html
%V 235
%X Probabilistic time series modeling based on generative models has attracted lots of attention because of its wide applications and excellent performance. However, existing state-of-the-art models, based on stochastic differential equation, not only struggle to determine the drift and diffusion coefficients during the design process but also have slow generation speed. To tackle this challenge, we firstly propose decomposable denoising diffusion model ($\text{D}^3\text{M}$) and prove it is a general framework unifying denoising diffusion models and continuous flow models. Based on the new framework, we propose some simple but efficient probability paths with high generation speed. Furthermore, we design a module that combines a special state space model with linear gated attention modules for sequence modeling. It preserves inductive bias and simultaneously models both local and global dependencies. Experimental results on 8 real-world datasets show that $\text{D}^3\text{M}$ reduces RMSE and CRPS by up to 4.6% and 4.3% compared with state-of-the-arts on imputation tasks, and achieves comparable results with state-of-the-arts on forecasting tasks with only 10 steps.
APA
Yan, T., Gong, H., Yongping, H., Zhan, Y. &amp; Xia, Y. (2024). Probabilistic Time Series Modeling with Decomposable Denoising Diffusion Model. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:55759-55777. Available from https://proceedings.mlr.press/v235/yan24b.html.