Achieving High Accuracy with PINNs via Energy Natural Gradient Descent

Johannes Müller, Marius Zeinhofer
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:25471-25485, 2023.

Abstract

We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric, as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method. As our main motivation, we show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction, modulo an orthogonal projection onto the model's tangent space. We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions, with errors several orders of magnitude smaller than those obtained when training PINNs with standard optimizers such as gradient descent or Adam, even when these are allowed significantly more computation time.
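To make the update concrete, below is a minimal sketch of one energy natural gradient step on a toy problem; the setup (a 1D Poisson problem -u'' = f on (0, 1) with zero Dirichlet data, the network size, the fixed step size) is an illustrative assumption, not the paper's experimental configuration. For the least-squares PINN loss L(theta) = 0.5 * mean(r(theta)^2), the Hessian-induced metric yields a Gauss-Newton-type Gramian G = J^T J / N on the parameters, where J is the Jacobian of the residual vector r, and one step reads theta <- theta - eta * G^+ grad L(theta) with the Moore-Penrose pseudoinverse.

# Hedged JAX sketch of energy natural gradient descent (ENGD) for a PINN.
# Assumptions, not from the paper: -u'' = f on (0, 1) with u(0) = u(1) = 0,
# f(x) = pi^2 sin(pi x) so the exact solution is u(x) = sin(pi x); a small
# tanh network; a fixed step size in place of any step-size control.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, widths=(1, 32, 32, 1)):
    params = []
    for m, n in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
    return params

def net(params, x):
    # Scalar-in, scalar-out MLP so nested jax.grad gives u''(x) directly.
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def residuals(params, xs_int, xs_bdry):
    u = lambda x: net(params, x)
    u_xx = jax.grad(jax.grad(u))
    f = lambda x: jnp.pi ** 2 * jnp.sin(jnp.pi * x)
    r_int = jax.vmap(lambda x: u_xx(x) + f(x))(xs_int)  # vanishes iff -u'' = f
    r_bdry = jax.vmap(u)(xs_bdry)                       # zero Dirichlet data
    return jnp.concatenate([r_int, r_bdry])

def engd_step(params, xs_int, xs_bdry, lr=0.5):
    flat, unravel = ravel_pytree(params)
    res = lambda p: residuals(unravel(p), xs_int, xs_bdry)
    r = res(flat)
    J = jax.jacfwd(res)(flat)              # residual Jacobian, shape (N, P)
    G = J.T @ J / r.shape[0]               # energy Gramian (Gauss-Newton matrix)
    g = J.T @ r / r.shape[0]               # gradient of 0.5 * mean(r ** 2)
    direction = jnp.linalg.pinv(G) @ g     # natural gradient via pseudoinverse
    return unravel(flat - lr * direction)

params = init_params(jax.random.PRNGKey(0))
xs_int = jnp.linspace(0.0, 1.0, 66)[1:-1]
xs_bdry = jnp.array([0.0, 1.0])
for _ in range(50):
    params = engd_step(params, xs_int, xs_bdry)

The pseudoinverse handles the rank deficiency of the Gramian; since it requires a dense factorization of a P x P matrix, this sketch is only practical for small parameter counts, and a damped solve of (G + eps * I) d = g would be the usual substitute at larger scale.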

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-muller23b,
  title     = {Achieving High Accuracy with {PINN}s via Energy Natural Gradient Descent},
  author    = {M\"{u}ller, Johannes and Zeinhofer, Marius},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {25471--25485},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/muller23b/muller23b.pdf},
  url       = {https://proceedings.mlr.press/v202/muller23b.html},
  abstract  = {We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method. As a main motivation we show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction modulo an orthogonal projection on the model’s tangent space. We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions with errors several orders of magnitude smaller than what is obtained when training PINNs with standard optimizers like gradient descent or Adam, even when those are allowed significantly more computation time.}
}
Endnote
%0 Conference Paper
%T Achieving High Accuracy with PINNs via Energy Natural Gradient Descent
%A Johannes Müller
%A Marius Zeinhofer
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-muller23b
%I PMLR
%P 25471--25485
%U https://proceedings.mlr.press/v202/muller23b.html
%V 202
%X We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method. As a main motivation we show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction modulo an orthogonal projection on the model’s tangent space. We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions with errors several orders of magnitude smaller than what is obtained when training PINNs with standard optimizers like gradient descent or Adam, even when those are allowed significantly more computation time.
APA
Müller, J. & Zeinhofer, M. (2023). Achieving High Accuracy with PINNs via Energy Natural Gradient Descent. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:25471-25485. Available from https://proceedings.mlr.press/v202/muller23b.html.
