Physics-informed machine learning as a kernel method

Nathan Doumèche, Francis Bach, Gérard Biau, Claire Boyer
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:1399-1450, 2024.

Abstract

Physics-informed machine learning combines the expressiveness of data-based approaches with the interpretability of physical models. In this context, we consider a general regression problem where the empirical risk is regularized by a partial differential equation that quantifies the physical inconsistency. We prove that for linear differential priors, the problem can be formulated as a kernel regression task. Taking advantage of kernel theory, we derive convergence rates for the minimizer $\hat f_n$ of the regularized risk and show that $\hat f_n$ converges at least at the Sobolev minimax rate. However, faster rates can be achieved, depending on the physical error. This principle is illustrated with a one-dimensional example, supporting the claim that regularizing the empirical risk with physical information can be beneficial to the statistical performance of estimators.
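To fix ideas, the regularized risk described in the abstract can be written schematically as follows. The notation here (sample $(X_i, Y_i)$, domain $\Omega$, Sobolev penalty weight $\lambda_n$, physical penalty weight $\mu_n$, and linear differential operator $\mathscr{D}$ encoding the PDE prior) is illustrative of the general framework rather than a quotation of the paper's exact definitions:

$$
\hat f_n \in \operatorname*{arg\,min}_{f} \;
\frac{1}{n}\sum_{i=1}^{n}\bigl(f(X_i)-Y_i\bigr)^2
\;+\; \lambda_n \,\|f\|_{H^s(\Omega)}^2
\;+\; \mu_n \int_{\Omega} \bigl(\mathscr{D}(f)(x)\bigr)^2\,dx.
$$

The last term vanishes exactly when $f$ satisfies the PDE $\mathscr{D}(f)=0$, so it quantifies the physical inconsistency of a candidate estimate. When $\mathscr{D}$ is linear, both penalties are quadratic forms in $f$, which is what makes it possible to reinterpret the whole problem as kernel (ridge) regression with a kernel determined by the differential prior.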

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-doumeche24a,
  title     = {Physics-informed machine learning as a kernel method},
  author    = {Doum{\`e}che, Nathan and Bach, Francis and Biau, G{\'e}rard and Boyer, Claire},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {1399--1450},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/doumeche24a/doumeche24a.pdf},
  url       = {https://proceedings.mlr.press/v247/doumeche24a.html}
}
APA
Doumèche, N., Bach, F., Biau, G. & Boyer, C. (2024). Physics-informed machine learning as a kernel method. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:1399-1450. Available from https://proceedings.mlr.press/v247/doumeche24a.html.
