Notes on Exact Boundary Values in Residual Minimisation

Johannes Müller, Marius Zeinhofer
Proceedings of Mathematical and Scientific Machine Learning, PMLR 190:231-240, 2022.

Abstract

We analyse the difference in convergence mode when using exact versus penalised boundary values for the residual minimisation of PDEs with neural-network-type ansatz functions, as is commonly done in the context of physics-informed neural networks. It is known that using an $L^2$ boundary penalty leads to a loss of regularity of $3/2$, meaning that approximation in $H^2$ yields a posteriori estimates only in $H^{1/2}$. These notes demonstrate how this loss of regularity can be circumvented if the functions in the ansatz class satisfy the boundary values exactly. Furthermore, it is shown that in this case the loss function provides a consistent a posteriori estimator for the $H^2$ norm of the error made by the residual minimisation method. We provide analogous results for linear time-dependent problems and discuss the implications of measuring the residual in Sobolev norms.
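As a sketch of the two settings contrasted above (the notation here is illustrative, not taken verbatim from the paper): for a model Poisson problem $-\Delta u = f$ in $\Omega$ with $u = g$ on $\partial\Omega$, the penalised formulation adds an $L^2$ boundary term to the interior residual, whereas the exact formulation minimises the interior residual alone over an ansatz class whose members already satisfy the boundary condition.

```latex
% Penalised boundary values (PINN-style): interior residual
% plus an L^2 boundary penalty with weight \lambda > 0
\mathcal{L}_{\mathrm{pen}}(u_\theta)
  = \| \Delta u_\theta + f \|_{L^2(\Omega)}^2
  + \lambda \, \| u_\theta - g \|_{L^2(\partial\Omega)}^2

% Exact boundary values: interior residual only, minimised over
% an ansatz class with u_\theta = g on \partial\Omega by construction
\mathcal{L}_{\mathrm{ex}}(u_\theta)
  = \| \Delta u_\theta + f \|_{L^2(\Omega)}^2
```

In the penalised case, control of $\mathcal{L}_{\mathrm{pen}}$ yields error bounds only in $H^{1/2}$; in the exact case, $\mathcal{L}_{\mathrm{ex}}$ controls the error in the full $H^2$ norm, which is the consistency result described in the abstract.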

Cite this Paper


BibTeX
@InProceedings{pmlr-v190-muller22b,
  title     = {Notes on Exact Boundary Values in Residual Minimisation},
  author    = {M\"{u}ller, Johannes and Zeinhofer, Marius},
  booktitle = {Proceedings of Mathematical and Scientific Machine Learning},
  pages     = {231--240},
  year      = {2022},
  editor    = {Dong, Bin and Li, Qianxiao and Wang, Lei and Xu, Zhi-Qin John},
  volume    = {190},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v190/muller22b/muller22b.pdf},
  url       = {https://proceedings.mlr.press/v190/muller22b.html},
  abstract  = {We analyse the difference in convergence mode using exact versus penalised boundary values for the residual minimisation of PDEs with neural network type ansatz functions, as is commonly done in the context of physics informed neural networks. It is known that using an $L^2$ boundary penalty leads to a loss of regularity of $3/2$ meaning that approximation in $H^2$ yields a posteriori estimates in $H^{1/2}$. These notes demonstrate how this loss of regularity can be circumvented if the functions in the ansatz class satisfy the boundary values exactly. Furthermore, it is shown that in this case, the loss function provides a consistent a posteriori error estimator in $H^2$ norm made by the residual minimisation method. We provide analogue results for linear time dependent problems and discuss the implications of measuring the residual in Sobolev norms.}
}
Endnote
%0 Conference Paper
%T Notes on Exact Boundary Values in Residual Minimisation
%A Johannes Müller
%A Marius Zeinhofer
%B Proceedings of Mathematical and Scientific Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Bin Dong
%E Qianxiao Li
%E Lei Wang
%E Zhi-Qin John Xu
%F pmlr-v190-muller22b
%I PMLR
%P 231--240
%U https://proceedings.mlr.press/v190/muller22b.html
%V 190
%X We analyse the difference in convergence mode using exact versus penalised boundary values for the residual minimisation of PDEs with neural network type ansatz functions, as is commonly done in the context of physics informed neural networks. It is known that using an $L^2$ boundary penalty leads to a loss of regularity of $3/2$ meaning that approximation in $H^2$ yields a posteriori estimates in $H^{1/2}$. These notes demonstrate how this loss of regularity can be circumvented if the functions in the ansatz class satisfy the boundary values exactly. Furthermore, it is shown that in this case, the loss function provides a consistent a posteriori error estimator in $H^2$ norm made by the residual minimisation method. We provide analogue results for linear time dependent problems and discuss the implications of measuring the residual in Sobolev norms.
APA
Müller, J. & Zeinhofer, M. (2022). Notes on Exact Boundary Values in Residual Minimisation. Proceedings of Mathematical and Scientific Machine Learning, in Proceedings of Machine Learning Research 190:231-240. Available from https://proceedings.mlr.press/v190/muller22b.html.
