Thermalizer: Stable autoregressive neural emulation of spatiotemporal chaos

Christian Pedersen, Laure Zanna, Joan Bruna
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:48635-48661, 2025.

Abstract

Autoregressive surrogate models (or emulators) of spatiotemporal systems provide an avenue for fast, approximate predictions, with broad applications across science and engineering. At inference time, however, these models are generally unable to provide predictions over long time rollouts due to accumulation of errors leading to diverging trajectories. In essence, emulators operate out of distribution, and controlling the online distribution quickly becomes intractable in large-scale settings. To address this fundamental issue, and focusing on time-stationary systems admitting an invariant measure, we leverage diffusion models to obtain an implicit estimator of the score of this invariant measure. We show that this model of the score function can be used to stabilize autoregressive emulator rollouts by applying on-the-fly denoising during inference, a process we call thermalization. Thermalizing an emulator rollout is shown to extend the time horizon of stable predictions by two orders of magnitude in complex systems exhibiting turbulent and chaotic behavior, opening up a novel application of diffusion models in the context of neural emulation.
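
The procedure described in the abstract can be summarized in a minimal sketch: alternate autoregressive emulator steps with occasional on-the-fly denoising steps driven by a learned score of the invariant measure. The sketch below assumes a trained emulator and a trained score/denoiser network as PyTorch callables; the names, the cadence therm_every, and the plain score-ascent update are illustrative assumptions, not the paper's exact algorithm or API.

    import torch

    @torch.no_grad()
    def thermalized_rollout(emulator, score_model, x0, n_steps,
                            therm_every=10, n_denoise=4, step_size=1e-2):
        # Roll out the emulator autoregressively, periodically "thermalizing"
        # the state with a few denoising steps guided by the learned score of
        # the system's invariant measure (illustrative sketch only).
        x = x0
        states = [x]
        for t in range(n_steps):
            x = emulator(x)  # one autoregressive prediction step
            if (t + 1) % therm_every == 0:
                # Treat accumulated rollout error as a small amount of noise
                # and nudge the state up the estimated score to remove it.
                for _ in range(n_denoise):
                    x = x + step_size * score_model(x)
            states.append(x)
        return torch.stack(states)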

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-pedersen25a,
  title     = {Thermalizer: Stable autoregressive neural emulation of spatiotemporal chaos},
  author    = {Pedersen, Christian and Zanna, Laure and Bruna, Joan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {48635--48661},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/pedersen25a/pedersen25a.pdf},
  url       = {https://proceedings.mlr.press/v267/pedersen25a.html}
}
Endnote
%0 Conference Paper
%T Thermalizer: Stable autoregressive neural emulation of spatiotemporal chaos
%A Christian Pedersen
%A Laure Zanna
%A Joan Bruna
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-pedersen25a
%I PMLR
%P 48635--48661
%U https://proceedings.mlr.press/v267/pedersen25a.html
%V 267
APA
Pedersen, C., Zanna, L. & Bruna, J. (2025). Thermalizer: Stable autoregressive neural emulation of spatiotemporal chaos. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:48635-48661. Available from https://proceedings.mlr.press/v267/pedersen25a.html.
