A Bregman Proximal Viewpoint on Neural Operators

Abdel-Rahim Mezidi, Jordan Patracone, Saverio Salzo, Amaury Habrard, Massimiliano Pontil, Rémi Emonet, Marc Sebban
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:43965-43989, 2025.

Abstract

We present several advances on neural operators by viewing the action of operator layers as the minimizers of Bregman regularized optimization problems over Banach function spaces. The proposed framework allows interpreting the activation operators as Bregman proximity operators from dual to primal space. This novel viewpoint is general enough to recover classical neural operators as well as a new variant, coined Bregman neural operators, which includes the inverse activation operator and features the same expressivity of standard neural operators. Numerical experiments support the added benefits of the Bregman variant of Fourier neural operators for training deeper and more accurate models.
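To make the viewpoint in the abstract concrete, the sketch below recalls the standard definitions of a Bregman divergence and a Bregman proximity operator, and contrasts a schematic classical layer update with the Bregman variant that "includes the inverse activation operator." This is an editorial gloss under generic notation, not an excerpt from the paper: the symbols $\varphi$, $f$, $W_t$, $b_t$, and $\sigma$ are placeholder choices, and the layer updates are simplified (finite-dimensional, no integral kernel term).

```latex
% Bregman divergence generated by a differentiable convex function \varphi,
% and the associated Bregman proximity operator of f:
\[
  D_\varphi(u, v) = \varphi(u) - \varphi(v) - \langle \nabla \varphi(v),\, u - v \rangle,
  \qquad
  \operatorname{prox}^{\varphi}_{f}(v) = \operatorname*{arg\,min}_{u}\; f(u) + D_\varphi(u, v).
\]
% When \varphi = \tfrac{1}{2}\|\cdot\|^2, D_\varphi is half the squared
% Euclidean distance and \operatorname{prox}^{\varphi}_{f} reduces to the
% classical proximity operator.
%
% Schematic layer updates (placeholder weights W_t, bias b_t, activation \sigma):
\[
  \text{classical: } v_{t+1} = \sigma\bigl(W_t v_t + b_t\bigr),
  \qquad
  \text{Bregman variant: } v_{t+1} = \sigma\bigl(\sigma^{-1}(v_t) + W_t v_t + b_t\bigr).
\]
```

In this reading, $\sigma$ acts as a map from dual to primal space and $\sigma^{-1}$ carries the current iterate back to the dual space before the affine update, which is what lets the classical layer be recovered when the inverse-activation term is dropped.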

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-mezidi25a,
  title     = {A {B}regman Proximal Viewpoint on Neural Operators},
  author    = {Mezidi, Abdel-Rahim and Patracone, Jordan and Salzo, Saverio and Habrard, Amaury and Pontil, Massimiliano and Emonet, R\'{e}mi and Sebban, Marc},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {43965--43989},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/mezidi25a/mezidi25a.pdf},
  url       = {https://proceedings.mlr.press/v267/mezidi25a.html},
  abstract  = {We present several advances on neural operators by viewing the action of operator layers as the minimizers of Bregman regularized optimization problems over Banach function spaces. The proposed framework allows interpreting the activation operators as Bregman proximity operators from dual to primal space. This novel viewpoint is general enough to recover classical neural operators as well as a new variant, coined Bregman neural operators, which includes the inverse activation operator and features the same expressivity of standard neural operators. Numerical experiments support the added benefits of the Bregman variant of Fourier neural operators for training deeper and more accurate models.}
}
Endnote
%0 Conference Paper
%T A Bregman Proximal Viewpoint on Neural Operators
%A Abdel-Rahim Mezidi
%A Jordan Patracone
%A Saverio Salzo
%A Amaury Habrard
%A Massimiliano Pontil
%A Rémi Emonet
%A Marc Sebban
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-mezidi25a
%I PMLR
%P 43965--43989
%U https://proceedings.mlr.press/v267/mezidi25a.html
%V 267
%X We present several advances on neural operators by viewing the action of operator layers as the minimizers of Bregman regularized optimization problems over Banach function spaces. The proposed framework allows interpreting the activation operators as Bregman proximity operators from dual to primal space. This novel viewpoint is general enough to recover classical neural operators as well as a new variant, coined Bregman neural operators, which includes the inverse activation operator and features the same expressivity of standard neural operators. Numerical experiments support the added benefits of the Bregman variant of Fourier neural operators for training deeper and more accurate models.
APA
Mezidi, A., Patracone, J., Salzo, S., Habrard, A., Pontil, M., Emonet, R. & Sebban, M. (2025). A Bregman Proximal Viewpoint on Neural Operators. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:43965-43989. Available from https://proceedings.mlr.press/v267/mezidi25a.html.