Learning Physics-Informed Neural Networks without Stacked Back-propagation

Di He, Shanda Li, Wenlei Shi, Xiaotian Gao, Jia Zhang, Jiang Bian, Liwei Wang, Tie-Yan Liu
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3034-3047, 2023.

Abstract

Physics-Informed Neural Network (PINN) has become a commonly used machine learning approach to solve partial differential equations (PDEs). However, for high-dimensional second-order PDE problems, PINN suffers from severe scalability issues, since its loss includes second-order derivatives whose computational cost grows with the dimension during stacked back-propagation. In this work, we develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks. In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein’s Identity, the second-order derivatives can be efficiently calculated without back-propagation. We further discuss the model capacity and provide variance reduction methods to address key limitations in the derivative estimation. Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is significantly faster.
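
To make the central idea concrete, the following is a minimal numerical sketch (not the authors' implementation). It assumes only a Gaussian-smoothed surrogate u(x) = E_{delta ~ N(0, sigma^2 I)}[f(x + delta)] for an arbitrary base model f, and uses the standard Gaussian Stein identities to estimate the first- and second-order derivatives of u from forward evaluations of f alone, i.e., without stacked back-propagation.

# Sketch of derivative estimation for a Gaussian-smoothed model via Stein's identity.
# Only forward evaluations of f are needed; no back-propagation through f.
import numpy as np

def stein_derivatives(f, x, sigma=0.1, n_samples=4096, rng=None):
    """Monte Carlo estimates of grad u(x) and the Hessian of u(x).

    For u(x) = E_{delta ~ N(0, sigma^2 I)}[f(x + delta)], Stein's identity gives
        grad u(x)    = E[ delta * f(x + delta) ] / sigma^2
        Hessian u(x) = E[ (delta delta^T - sigma^2 I) * f(x + delta) ] / sigma^4,
    so both derivatives are plain expectations over perturbed forward passes.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    delta = rng.normal(scale=sigma, size=(n_samples, d))   # Gaussian perturbations
    vals = f(x[None, :] + delta)                            # forward evaluations only
    grad = (delta * vals[:, None]).mean(axis=0) / sigma**2
    outer = delta[:, :, None] * delta[:, None, :] - sigma**2 * np.eye(d)
    hess = (outer * vals[:, None, None]).mean(axis=0) / sigma**4
    return grad, hess

# Example: f(x) = sum(x^2); the smoothed u has grad u(x) = 2x and Hessian 2I.
if __name__ == "__main__":
    f = lambda xs: np.sum(xs**2, axis=-1)
    g, H = stein_derivatives(f, np.array([0.5, -1.0]), sigma=0.1, n_samples=200000)
    print(g, np.diag(H))

As the abstract notes, the variance of such estimators grows as sigma shrinks; one common remedy (a possible instance of the variance reduction the paper discusses, not necessarily the authors' exact scheme) is to replace f(x + delta) by f(x + delta) - f(x), which leaves both expectations unchanged because E[delta] = 0 and E[delta delta^T - sigma^2 I] = 0.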

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-he23a,
  title = {Learning Physics-Informed Neural Networks without Stacked Back-propagation},
  author = {He, Di and Li, Shanda and Shi, Wenlei and Gao, Xiaotian and Zhang, Jia and Bian, Jiang and Wang, Liwei and Liu, Tie-Yan},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages = {3034--3047},
  year = {2023},
  editor = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume = {206},
  series = {Proceedings of Machine Learning Research},
  month = {25--27 Apr},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v206/he23a/he23a.pdf},
  url = {https://proceedings.mlr.press/v206/he23a.html},
  abstract = {Physics-Informed Neural Network (PINN) has become a commonly used machine learning approach to solve partial differential equations (PDEs). However, for high-dimensional second-order PDE problems, PINN suffers from severe scalability issues, since its loss includes second-order derivatives whose computational cost grows with the dimension during stacked back-propagation. In this work, we develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks. In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein’s Identity, the second-order derivatives can be efficiently calculated without back-propagation. We further discuss the model capacity and provide variance reduction methods to address key limitations in the derivative estimation. Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is significantly faster.}
}
Endnote
%0 Conference Paper
%T Learning Physics-Informed Neural Networks without Stacked Back-propagation
%A Di He
%A Shanda Li
%A Wenlei Shi
%A Xiaotian Gao
%A Jia Zhang
%A Jiang Bian
%A Liwei Wang
%A Tie-Yan Liu
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-he23a
%I PMLR
%P 3034--3047
%U https://proceedings.mlr.press/v206/he23a.html
%V 206
%X Physics-Informed Neural Network (PINN) has become a commonly used machine learning approach to solve partial differential equations (PDEs). However, for high-dimensional second-order PDE problems, PINN suffers from severe scalability issues, since its loss includes second-order derivatives whose computational cost grows with the dimension during stacked back-propagation. In this work, we develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks. In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein’s Identity, the second-order derivatives can be efficiently calculated without back-propagation. We further discuss the model capacity and provide variance reduction methods to address key limitations in the derivative estimation. Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is significantly faster.
APA
He, D., Li, S., Shi, W., Gao, X., Zhang, J., Bian, J., Wang, L. & Liu, T. (2023). Learning Physics-Informed Neural Networks without Stacked Back-propagation. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:3034-3047. Available from https://proceedings.mlr.press/v206/he23a.html.