Riemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks

Anas Jnini, Lorenzo Breschi, Flavio Vella
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:28304-28326, 2025.

Abstract

Divergence-free symmetric tensors (DFSTs) are fundamental in continuum mechanics, encoding conservation laws such as mass and momentum conservation. We introduce Riemann Tensor Neural Networks (RTNNs), a novel neural architecture that inherently satisfies the DFST condition to machine precision, providing a strong inductive bias for enforcing these conservation laws. We prove that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision and demonstrate their effectiveness as surrogates for conservative PDEs, achieving improved accuracy across benchmarks. This work is the first to use DFSTs as an inductive bias in neural PDE surrogates and to explicitly enforce the conservation of both mass and momentum within a physics-constrained neural architecture.
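For readers unfamiliar with the term: a symmetric tensor field T is divergence-free when ∂_i T^{ij} = 0 for every component j, which is exactly the form that conservation of mass and momentum takes in continuum mechanics. The paper's RTNN construction is not reproduced on this page; the sketch below only illustrates what "satisfying the DFST condition by construction" can mean, using the classical 2D Airy-potential representation T^{ij} = ε^{ik} ε^{jl} ∂_k ∂_l φ rather than the paper's architecture. The toy MLP potential and all names and shapes in it are illustrative assumptions.

import jax
import jax.numpy as jnp

# Illustrative sketch only: a 2D divergence-free symmetric tensor built
# from a scalar (Airy-type) potential phi, NOT the paper's RTNN.
#   T = [[  phi_yy, -phi_xy],
#        [ -phi_xy,  phi_xx]]
# is symmetric and satisfies d_i T^{ij} = 0 identically, so any network
# parameterizing phi yields a DFST up to floating-point round-off.

def potential(params, x):
    # Toy MLP potential phi(x); `params` and the architecture are
    # hypothetical stand-ins.
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return (h @ w2 + b2).squeeze()

def dfst(params, x):
    # T^{ij} = eps^{ik} eps^{jl} d_k d_l phi, via the Hessian of phi.
    H = jax.hessian(lambda z: potential(params, z))(x)
    eps = jnp.array([[0.0, 1.0], [-1.0, 0.0]])  # 2D Levi-Civita symbol
    return eps @ H @ eps.T

def divergence(params, x):
    # (div T)^j = d_i T^{ij}; analytically zero for this construction.
    J = jax.jacfwd(lambda z: dfst(params, z))(x)  # J[i, j, k] = dT^{ij}/dx_k
    return jnp.einsum("iji->j", J)

k1, k2 = jax.random.split(jax.random.PRNGKey(0))
params = (jax.random.normal(k1, (2, 16)), jnp.zeros(16),
          jax.random.normal(k2, (16, 1)), jnp.zeros(1))
x = jnp.array([0.3, -0.7])
print(dfst(params, x))        # symmetric 2x2 tensor
print(divergence(params, x))  # ~0 up to float32 round-off

This scalar-potential trick is specific to two dimensions; the abstract's universality claim, that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision, goes well beyond what this 2D sketch establishes.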

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-jnini25a,
  title     = {{R}iemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks},
  author    = {Jnini, Anas and Breschi, Lorenzo and Vella, Flavio},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {28304--28326},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/jnini25a/jnini25a.pdf},
  url       = {https://proceedings.mlr.press/v267/jnini25a.html},
  abstract  = {Divergence-free symmetric tensors (DFSTs) are fundamental in continuum mechanics, encoding conservation laws such as mass and momentum conservation. We introduce Riemann Tensor Neural Networks (RTNNs), a novel neural architecture that inherently satisfies the DFST condition to machine precision, providing a strong inductive bias for enforcing these conservation laws. We prove that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision and demonstrate their effectiveness as surrogates for conservative PDEs, achieving improved accuracy across benchmarks. This work is the first to use DFSTs as an inductive bias in neural PDE surrogates and to explicitly enforce the conservation of both mass and momentum within a physics-constrained neural architecture.}
}
Endnote
%0 Conference Paper
%T Riemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks
%A Anas Jnini
%A Lorenzo Breschi
%A Flavio Vella
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-jnini25a
%I PMLR
%P 28304--28326
%U https://proceedings.mlr.press/v267/jnini25a.html
%V 267
%X Divergence-free symmetric tensors (DFSTs) are fundamental in continuum mechanics, encoding conservation laws such as mass and momentum conservation. We introduce Riemann Tensor Neural Networks (RTNNs), a novel neural architecture that inherently satisfies the DFST condition to machine precision, providing a strong inductive bias for enforcing these conservation laws. We prove that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision and demonstrate their effectiveness as surrogates for conservative PDEs, achieving improved accuracy across benchmarks. This work is the first to use DFSTs as an inductive bias in neural PDE surrogates and to explicitly enforce the conservation of both mass and momentum within a physics-constrained neural architecture.
APA
Jnini, A., Breschi, L. & Vella, F. (2025). Riemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:28304-28326. Available from https://proceedings.mlr.press/v267/jnini25a.html.
