Understanding the difficulty of solving Cauchy problems with PINNs

Tao Wang, Bo Zhao, Sicun Gao, Rose Yu
Proceedings of the 6th Annual Learning for Dynamics & Control Conference, PMLR 242:453-465, 2024.

Abstract

Physics-Informed Neural Networks (PINNs) have gained popularity in scientific computing in recent years. However, they often fail to achieve the same level of accuracy as classical methods in solving differential equations. In this paper, we aim to understand this issue from two perspectives in the case of Cauchy problems: the use of $L^2$ residuals as objective functions and the approximation gap of neural networks. We show that minimizing the sum of $L^2$ residual and initial condition error is not sufficient to guarantee the true solution, as this loss function does not capture the underlying dynamics. Additionally, neural networks are not capable of capturing singularities in the solutions due to the non-compactness of their image sets. This, in turn, influences the existence of global minima and the regularity of the network. We demonstrate that when the global minimum does not exist, machine precision becomes the predominant source of achievable error in practice. We also present numerical experiments in support of our theoretical claims.
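
For reference, the objective in question can be written schematically as follows (our notation, a standard PINN formulation rather than a verbatim excerpt from the paper): for a Cauchy problem $\partial_t u = \mathcal{N}[u]$ on $[0,T]\times\Omega$ with initial data $u(0,\cdot) = u_0$, a network $u_\theta$ is trained by minimizing

$\mathcal{L}(\theta) \;=\; \big\|\partial_t u_\theta - \mathcal{N}[u_\theta]\big\|_{L^2([0,T]\times\Omega)}^2 \;+\; \big\|u_\theta(0,\cdot) - u_0\big\|_{L^2(\Omega)}^2.$

The paper's first point is that a small value of $\mathcal{L}(\theta)$ does not by itself certify that $u_\theta$ is close to the true solution of the Cauchy problem.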

Cite this Paper


BibTeX
@InProceedings{pmlr-v242-wang24b,
  title     = {Understanding the difficulty of solving {C}auchy problems with {PINN}s},
  author    = {Wang, Tao and Zhao, Bo and Gao, Sicun and Yu, Rose},
  booktitle = {Proceedings of the 6th Annual Learning for Dynamics \& Control Conference},
  pages     = {453--465},
  year      = {2024},
  editor    = {Abate, Alessandro and Cannon, Mark and Margellos, Kostas and Papachristodoulou, Antonis},
  volume    = {242},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v242/wang24b/wang24b.pdf},
  url       = {https://proceedings.mlr.press/v242/wang24b.html},
  abstract  = {Physics-Informed Neural Networks (PINNs) have gained popularity in scientific computing in recent years. However, they often fail to achieve the same level of accuracy as classical methods in solving differential equations. In this paper, we aim to understand this issue from two perspectives in the case of Cauchy problems: the use of $L^2$ residuals as objective functions and the approximation gap of neural networks. We show that minimizing the sum of $L^2$ residual and initial condition error is not sufficient to guarantee the true solution, as this loss function does not capture the underlying dynamics. Additionally, neural networks are not capable of capturing singularities in the solutions due to the non-compactness of their image sets. This, in turn, influences the existence of global minima and the regularity of the network. We demonstrate that when the global minimum does not exist, machine precision becomes the predominant source of achievable error in practice. We also present numerical experiments in support of our theoretical claims.}
}
APA
Wang, T., Zhao, B., Gao, S., & Yu, R. (2024). Understanding the difficulty of solving Cauchy problems with PINNs. Proceedings of the 6th Annual Learning for Dynamics & Control Conference, in Proceedings of Machine Learning Research 242:453-465. Available from https://proceedings.mlr.press/v242/wang24b.html.
