Conditional Diffusion Model with Nonlinear Data Transformation for Time Series Forecasting

J Rishi, Gvs Mothish, Deepak Subramani
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:51703-51723, 2025.

Abstract

Time-series forecasting finds application across domains such as finance, climate science, and energy systems. We introduce the Conditional Diffusion with Nonlinear Data Transformation Model (CN-Diff), a generative framework that employs novel nonlinear transformations and learnable conditions in the forward process for time series forecasting. A new loss formulation for training is proposed, along with detailed derivations of both the forward and reverse processes. These additions improve the diffusion model’s capacity to capture complex time series patterns, thus simplifying the reverse process. Our novel condition facilitates learning an efficient prior distribution. This also reduces the gap between the true negative log-likelihood and its variational approximation. CN-Diff is shown to perform better than other leading time series models on nine real-world datasets. Ablation studies are conducted to elucidate the role of each component of CN-Diff.

Cite this Paper
BibTeX
@InProceedings{pmlr-v267-rishi25a,
  title     = {Conditional Diffusion Model with Nonlinear Data Transformation for Time Series Forecasting},
  author    = {Rishi, J and Mothish, Gvs and Subramani, Deepak},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {51703--51723},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/rishi25a/rishi25a.pdf},
  url       = {https://proceedings.mlr.press/v267/rishi25a.html},
  abstract  = {Time-series forecasting finds application across domains such as finance, climate science, and energy systems. We introduce the Conditional Diffusion with Nonlinear Data Transformation Model (CN-Diff), a generative framework that employs novel nonlinear transformations and learnable conditions in the forward process for time series forecasting. A new loss formulation for training is proposed, along with detailed derivations of both the forward and reverse processes. These additions improve the diffusion model’s capacity to capture complex time series patterns, thus simplifying the reverse process. Our novel condition facilitates learning an efficient prior distribution. This also reduces the gap between the true negative log-likelihood and its variational approximation. CN-Diff is shown to perform better than other leading time series models on nine real-world datasets. Ablation studies are conducted to elucidate the role of each component of CN-Diff.}
}
Endnote
%0 Conference Paper
%T Conditional Diffusion Model with Nonlinear Data Transformation for Time Series Forecasting
%A J Rishi
%A Gvs Mothish
%A Deepak Subramani
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-rishi25a
%I PMLR
%P 51703--51723
%U https://proceedings.mlr.press/v267/rishi25a.html
%V 267
%X Time-series forecasting finds application across domains such as finance, climate science, and energy systems. We introduce the Conditional Diffusion with Nonlinear Data Transformation Model (CN-Diff), a generative framework that employs novel nonlinear transformations and learnable conditions in the forward process for time series forecasting. A new loss formulation for training is proposed, along with detailed derivations of both the forward and reverse processes. These additions improve the diffusion model’s capacity to capture complex time series patterns, thus simplifying the reverse process. Our novel condition facilitates learning an efficient prior distribution. This also reduces the gap between the true negative log-likelihood and its variational approximation. CN-Diff is shown to perform better than other leading time series models on nine real-world datasets. Ablation studies are conducted to elucidate the role of each component of CN-Diff.
APA
Rishi, J., Mothish, G., & Subramani, D. (2025). Conditional Diffusion Model with Nonlinear Data Transformation for Time Series Forecasting. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:51703-51723. Available from https://proceedings.mlr.press/v267/rishi25a.html.