Random Grid Neural Processes for Parametric Partial Differential Equations

Arnaud Vadeboncoeur, Ieva Kazlauskaite, Yanni Papandreou, Fehmi Cirak, Mark Girolami, Omer Deniz Akyildiz
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:34759-34778, 2023.

Abstract

We introduce a new class of spatially stochastic physics and data informed deep latent models for parametric partial differential equations (PDEs) which operate through scalable variational neural processes. We achieve this by assigning probability measures to the spatial domain, which allows us to treat collocation grids probabilistically as random variables to be marginalised out. Adapting this spatial statistics view, we solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields. The implementation of these random grids poses a unique set of challenges for inverse physics informed deep learning frameworks and we propose a new architecture called Grid Invariant Convolutional Networks (GICNets) to overcome these challenges. We further show how to incorporate noisy data in a principled manner into our physics informed model to improve predictions for problems where data may be available but whose measurement location does not coincide with any fixed mesh or grid. The proposed method is tested on a nonlinear Poisson problem, Burgers equation, and Navier-Stokes equations, and we provide extensive numerical comparisons. We demonstrate significant computational advantages over current physics informed neural learning methods for parametric PDEs while improving the predictive capabilities and flexibility of these models.
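The core mechanism described above, drawing collocation points from a probability measure on the spatial domain rather than fixing a mesh and averaging the physics-informed residual over those random draws, can be illustrated with a short sketch. The snippet below is not the authors' implementation: the plain MLP surrogate, the toy nonlinear Poisson residual with unit forcing, and the uniform sampling measure on the unit square are all illustrative assumptions. It only shows how a fresh random collocation grid can be sampled at every loss evaluation, so that the training objective approximately marginalises over grid locations.

# Minimal, hypothetical sketch of a random-collocation-grid physics loss.
# Not the paper's GICNet / variational neural process model; the network,
# PDE residual, and sampling measure below are placeholders.
import jax
import jax.numpy as jnp

def mlp(params, x):
    # Small fully connected surrogate u_theta(x); placeholder architecture.
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b).squeeze()

def residual(params, x):
    # Residual of a toy nonlinear Poisson problem: -lap(u) + u^3 - f(x) = 0,
    # with f == 1 used as a stand-in forcing term.
    u = lambda z: mlp(params, z)
    lap = jnp.trace(jax.hessian(u)(x))   # Laplacian of u at point x
    return -lap + u(x) ** 3 - 1.0

def physics_loss(params, key, n_pts=128):
    # Draw a fresh random collocation grid from a uniform measure on [0,1]^2
    # at every call, so no fixed mesh is ever assumed; averaging the squared
    # residual is a Monte Carlo estimate over the random grid.
    pts = jax.random.uniform(key, (n_pts, 2))
    res = jax.vmap(lambda x: residual(params, x))(pts)
    return jnp.mean(res ** 2)

def init_params(key, sizes=(2, 32, 32, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(0.1 * jax.random.normal(k, (m, n)), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

params = init_params(jax.random.PRNGKey(0))
loss, grads = jax.value_and_grad(physics_loss)(params, jax.random.PRNGKey(1))

Under this reading, repeated loss evaluations with different random keys act as Monte Carlo samples of the random grid; the paper's actual models additionally build Gaussian process solution fields and use the GICNet architecture, neither of which is reproduced here.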

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-vadeboncoeur23a,
  title     = {Random Grid Neural Processes for Parametric Partial Differential Equations},
  author    = {Vadeboncoeur, Arnaud and Kazlauskaite, Ieva and Papandreou, Yanni and Cirak, Fehmi and Girolami, Mark and Akyildiz, Omer Deniz},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {34759--34778},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/vadeboncoeur23a/vadeboncoeur23a.pdf},
  url       = {https://proceedings.mlr.press/v202/vadeboncoeur23a.html}
}
Endnote
%0 Conference Paper
%T Random Grid Neural Processes for Parametric Partial Differential Equations
%A Arnaud Vadeboncoeur
%A Ieva Kazlauskaite
%A Yanni Papandreou
%A Fehmi Cirak
%A Mark Girolami
%A Omer Deniz Akyildiz
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-vadeboncoeur23a
%I PMLR
%P 34759--34778
%U https://proceedings.mlr.press/v202/vadeboncoeur23a.html
%V 202
APA
Vadeboncoeur, A., Kazlauskaite, I., Papandreou, Y., Cirak, F., Girolami, M. & Akyildiz, O.D. (2023). Random Grid Neural Processes for Parametric Partial Differential Equations. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:34759-34778. Available from https://proceedings.mlr.press/v202/vadeboncoeur23a.html.