Zebra: In-Context Generative Pretraining for Solving Parametric PDEs

Louis Serrano, Armand Kassaï Koupaï, Thomas X Wang, Pierre Erbacher, Patrick Gallinari
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:53940-53988, 2025.

Abstract

Solving time-dependent parametric partial differential equations (PDEs) is challenging for data-driven methods, as these models must adapt to variations in parameters such as coefficients, forcing terms, and initial conditions. State-of-the-art neural surrogates perform adaptation through gradient-based optimization and meta-learning to implicitly encode the variety of dynamics from observations. This often comes with increased inference complexity. Inspired by the in-context learning capabilities of large language models (LLMs), we introduce Zebra, a novel generative auto-regressive transformer designed to solve parametric PDEs without requiring gradient adaptation at inference. By leveraging in-context information during both pre-training and inference, Zebra dynamically adapts to new tasks by conditioning on input sequences that incorporate context example trajectories. As a generative model, Zebra can be used to generate new trajectories and allows quantifying the uncertainty of the predictions. We evaluate Zebra across a variety of challenging PDE scenarios, demonstrating its adaptability, robustness, and superior performance compared to existing approaches.
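
The abstract describes the mechanism only at a high level. As a rough, hypothetical sketch of what "conditioning on input sequences that incorporate context example trajectories" can look like, the PyTorch snippet below concatenates tokenized context trajectories with a query prompt and decodes the continuation autoregressively. This is not the authors' implementation: the tokenizer, vocabulary, model sizes, and sampling scheme are placeholder assumptions; only the overall pattern (in-context conditioning with no gradient updates at inference, plus sampling to probe uncertainty) follows the abstract.

import torch
import torch.nn as nn

class InContextPDETransformer(nn.Module):
    # Toy decoder-only transformer over tokenized PDE states. Context example
    # trajectories and the query trajectory share one sequence, so adaptation
    # to a new PDE instance happens purely through attention (in-context),
    # with no gradient updates at inference time. All sizes are illustrative.
    def __init__(self, vocab_size=512, d_model=128, n_heads=4, n_layers=4, max_len=1024):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids produced by some spatial
        # tokenizer of the PDE states (assumed, not specified in the abstract).
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        h = self.tok(tokens) + self.pos(pos)
        causal = nn.Transformer.generate_square_subsequent_mask(seq_len)
        return self.head(self.blocks(h, mask=causal))

@torch.no_grad()
def generate(model, context_tokens, prompt_tokens, n_new, temperature=1.0):
    # Concatenate tokenized context trajectories with the query prompt and
    # sample the continuation autoregressively; drawing several continuations
    # is one way to quantify predictive uncertainty.
    seq = torch.cat([context_tokens, prompt_tokens], dim=1)
    for _ in range(n_new):
        logits = model(seq)[:, -1] / temperature
        nxt = torch.multinomial(torch.softmax(logits, dim=-1), num_samples=1)
        seq = torch.cat([seq, nxt], dim=1)
    return seq[:, -n_new:]

# Usage with dummy token ids standing in for encoded trajectories.
model = InContextPDETransformer()
context = torch.randint(0, 512, (1, 64))   # tokens of context example trajectories
prompt = torch.randint(0, 512, (1, 16))    # tokens of the query initial condition
rollout = generate(model, context, prompt, n_new=16)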

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-serrano25a,
  title     = {Zebra: In-Context Generative Pretraining for Solving Parametric {PDE}s},
  author    = {Serrano, Louis and Kassa\"{\i} Koupa\"{\i}, Armand and Wang, Thomas X and Erbacher, Pierre and Gallinari, Patrick},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {53940--53988},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/serrano25a/serrano25a.pdf},
  url       = {https://proceedings.mlr.press/v267/serrano25a.html},
  abstract  = {Solving time-dependent parametric partial differential equations (PDEs) is challenging for data-driven methods, as these models must adapt to variations in parameters such as coefficients, forcing terms, and initial conditions. State-of-the-art neural surrogates perform adaptation through gradient-based optimization and meta-learning to implicitly encode the variety of dynamics from observations. This often comes with increased inference complexity. Inspired by the in-context learning capabilities of large language models (LLMs), we introduce Zebra, a novel generative auto-regressive transformer designed to solve parametric PDEs without requiring gradient adaptation at inference. By leveraging in-context information during both pre-training and inference, Zebra dynamically adapts to new tasks by conditioning on input sequences that incorporate context example trajectories. As a generative model, Zebra can be used to generate new trajectories and allows quantifying the uncertainty of the predictions. We evaluate Zebra across a variety of challenging PDE scenarios, demonstrating its adaptability, robustness, and superior performance compared to existing approaches.}
}
Endnote
%0 Conference Paper
%T Zebra: In-Context Generative Pretraining for Solving Parametric PDEs
%A Louis Serrano
%A Armand Kassaï Koupaï
%A Thomas X Wang
%A Pierre Erbacher
%A Patrick Gallinari
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-serrano25a
%I PMLR
%P 53940--53988
%U https://proceedings.mlr.press/v267/serrano25a.html
%V 267
%X Solving time-dependent parametric partial differential equations (PDEs) is challenging for data-driven methods, as these models must adapt to variations in parameters such as coefficients, forcing terms, and initial conditions. State-of-the-art neural surrogates perform adaptation through gradient-based optimization and meta-learning to implicitly encode the variety of dynamics from observations. This often comes with increased inference complexity. Inspired by the in-context learning capabilities of large language models (LLMs), we introduce Zebra, a novel generative auto-regressive transformer designed to solve parametric PDEs without requiring gradient adaptation at inference. By leveraging in-context information during both pre-training and inference, Zebra dynamically adapts to new tasks by conditioning on input sequences that incorporate context example trajectories. As a generative model, Zebra can be used to generate new trajectories and allows quantifying the uncertainty of the predictions. We evaluate Zebra across a variety of challenging PDE scenarios, demonstrating its adaptability, robustness, and superior performance compared to existing approaches.
APA
Serrano, L., Kassaï Koupaï, A., Wang, T.X., Erbacher, P. & Gallinari, P. (2025). Zebra: In-Context Generative Pretraining for Solving Parametric PDEs. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:53940-53988. Available from https://proceedings.mlr.press/v267/serrano25a.html.