Latent Mamba Operator for Partial Differential Equations

Karn Tiwari, Niladri Dutta, N M Anoop Krishnan, Prathosh Ap
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:59639-59667, 2025.

Abstract

Neural operators have emerged as powerful data-driven frameworks for solving Partial Differential Equations (PDEs), offering significant speedups over numerical methods. However, existing neural operators struggle with scalability in high-dimensional spaces, incur high computational costs, and face challenges in capturing continuous and long-range dependencies in PDE dynamics. To address these limitations, we introduce the Latent Mamba Operator (LaMO), which integrates the efficiency of state-space models (SSMs) in latent space with the expressive power of kernel integral formulations in neural operators. We also establish a theoretical connection between SSMs and the kernel integral of neural operators. In extensive experiments across diverse PDE benchmarks on regular grids, structured meshes, and point clouds, covering both solid and fluid physics datasets, LaMO achieves consistent state-of-the-art (SOTA) performance, with a 32.3% improvement over existing baselines in solution operator approximation, highlighting its efficacy in modeling complex PDE solutions.
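The SSM-to-kernel-integral connection mentioned above can be sketched in a few lines (a minimal derivation in standard notation, assuming the usual continuous-time linear SSM; the paper's exact statement and notation may differ). A linear SSM with state matrix $A$, input map $B$, and output map $C$,

$$h'(t) = A\,h(t) + B\,v(t), \qquad u(t) = C\,h(t),$$

has, for zero initial state $h(0) = 0$, the closed-form solution

$$u(t) = \int_{0}^{t} C\,e^{A(t-s)}B\,v(s)\,\mathrm{d}s,$$

i.e., a kernel integral operator with kernel $\kappa(t,s) = C\,e^{A(t-s)}B$. This is the same form as the neural-operator kernel integral $(\mathcal{K}v)(x) = \int_{D}\kappa(x,y)\,v(y)\,\mathrm{d}y$, which is what makes an SSM layer interpretable as a structured, learnable-kernel integral operator.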

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-tiwari25a,
  title = {Latent Mamba Operator for Partial Differential Equations},
  author = {Tiwari, Karn and Dutta, Niladri and Krishnan, N M Anoop and Ap, Prathosh},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages = {59639--59667},
  year = {2025},
  editor = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume = {267},
  series = {Proceedings of Machine Learning Research},
  month = {13--19 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/tiwari25a/tiwari25a.pdf},
  url = {https://proceedings.mlr.press/v267/tiwari25a.html},
  abstract = {Neural operators have emerged as powerful data-driven frameworks for solving Partial Differential Equations (PDEs), offering significant speedups over numerical methods. However, existing neural operators struggle with scalability in high-dimensional spaces, incur high computational costs, and face challenges in capturing continuous and long-range dependencies in PDE dynamics. To address these limitations, we introduce the Latent Mamba Operator (LaMO), which integrates the efficiency of state-space models (SSMs) in latent space with the expressive power of kernel integral formulations in neural operators. We also establish a theoretical connection between SSMs and the kernel integral of neural operators. In extensive experiments across diverse PDE benchmarks on regular grids, structured meshes, and point clouds, covering both solid and fluid physics datasets, LaMO achieves consistent state-of-the-art (SOTA) performance, with a 32.3% improvement over existing baselines in solution operator approximation, highlighting its efficacy in modeling complex PDE solutions.}
}
Endnote
%0 Conference Paper
%T Latent Mamba Operator for Partial Differential Equations
%A Karn Tiwari
%A Niladri Dutta
%A N M Anoop Krishnan
%A Prathosh Ap
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-tiwari25a
%I PMLR
%P 59639--59667
%U https://proceedings.mlr.press/v267/tiwari25a.html
%V 267
%X Neural operators have emerged as powerful data-driven frameworks for solving Partial Differential Equations (PDEs), offering significant speedups over numerical methods. However, existing neural operators struggle with scalability in high-dimensional spaces, incur high computational costs, and face challenges in capturing continuous and long-range dependencies in PDE dynamics. To address these limitations, we introduce the Latent Mamba Operator (LaMO), which integrates the efficiency of state-space models (SSMs) in latent space with the expressive power of kernel integral formulations in neural operators. We also establish a theoretical connection between SSMs and the kernel integral of neural operators. In extensive experiments across diverse PDE benchmarks on regular grids, structured meshes, and point clouds, covering both solid and fluid physics datasets, LaMO achieves consistent state-of-the-art (SOTA) performance, with a 32.3% improvement over existing baselines in solution operator approximation, highlighting its efficacy in modeling complex PDE solutions.
APA
Tiwari, K., Dutta, N., Krishnan, N.M.A. & Ap, P. (2025). Latent Mamba Operator for Partial Differential Equations. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:59639-59667. Available from https://proceedings.mlr.press/v267/tiwari25a.html.
