Sub-Sequential Physics-Informed Learning with State Space Model

Chenhui Xu, Dancheng Liu, Yuting Hu, Jiajie Li, Ruiyang Qin, Qingxiao Zheng, Jinjun Xiong
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:69507-69525, 2025.

Abstract

Physics-Informed Neural Networks (PINNs) are a class of deep-learning-based numerical solvers for partial differential equations (PDEs). Existing PINNs often suffer from failure modes of being unable to propagate patterns of initial conditions. We discover that these failure modes are caused by the simplicity bias of neural networks and the mismatch between PDE’s continuity and PINN’s discrete sampling. We reveal that the State Space Model (SSM) can be a continuous-discrete articulation allowing initial condition propagation, and that simplicity bias can be eliminated by aligning a sequence of moderate granularity. Accordingly, we propose PINNMamba, a novel framework that introduces sub-sequence modeling with SSM. Experimental results show that PINNMamba can reduce errors by up to 86.3% compared with state-of-the-art architecture. Our code is available in the Supplementary Material.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-xu25t,
  title     = {Sub-Sequential Physics-Informed Learning with State Space Model},
  author    = {Xu, Chenhui and Liu, Dancheng and Hu, Yuting and Li, Jiajie and Qin, Ruiyang and Zheng, Qingxiao and Xiong, Jinjun},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {69507--69525},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/xu25t/xu25t.pdf},
  url       = {https://proceedings.mlr.press/v267/xu25t.html},
  abstract  = {Physics-Informed Neural Networks (PINNs) are a kind of deep-learning-based numerical solvers for partial differential equations (PDEs). Existing PINNs often suffer from failure modes of being unable to propagate patterns of initial conditions. We discover that these failure modes are caused by the simplicity bias of neural networks and the mismatch between PDE’s continuity and PINN’s discrete sampling. We reveal that the State Space Model (SSM) can be a continuous-discrete articulation allowing initial condition propagation, and that simplicity bias can be eliminated by aligning a sequence of moderate granularity. Accordingly, we propose PINNMamba, a novel framework that introduces sub-sequence modeling with SSM. Experimental results show that PINNMamba can reduce errors by up to 86.3% compared with state-of-the-art architecture. Our code is available at Supplementary Material.}
}
Endnote
%0 Conference Paper
%T Sub-Sequential Physics-Informed Learning with State Space Model
%A Chenhui Xu
%A Dancheng Liu
%A Yuting Hu
%A Jiajie Li
%A Ruiyang Qin
%A Qingxiao Zheng
%A Jinjun Xiong
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-xu25t
%I PMLR
%P 69507--69525
%U https://proceedings.mlr.press/v267/xu25t.html
%V 267
%X Physics-Informed Neural Networks (PINNs) are a kind of deep-learning-based numerical solvers for partial differential equations (PDEs). Existing PINNs often suffer from failure modes of being unable to propagate patterns of initial conditions. We discover that these failure modes are caused by the simplicity bias of neural networks and the mismatch between PDE’s continuity and PINN’s discrete sampling. We reveal that the State Space Model (SSM) can be a continuous-discrete articulation allowing initial condition propagation, and that simplicity bias can be eliminated by aligning a sequence of moderate granularity. Accordingly, we propose PINNMamba, a novel framework that introduces sub-sequence modeling with SSM. Experimental results show that PINNMamba can reduce errors by up to 86.3% compared with state-of-the-art architecture. Our code is available at Supplementary Material.
APA
Xu, C., Liu, D., Hu, Y., Li, J., Qin, R., Zheng, Q., & Xiong, J. (2025). Sub-Sequential Physics-Informed Learning with State Space Model. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:69507-69525. Available from https://proceedings.mlr.press/v267/xu25t.html.