Position: Optimization in SciML Should Employ the Function Space Geometry

Johannes Müller, Marius Zeinhofer
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:36705-36722, 2024.

Abstract

We provide an infinite-dimensional view on optimization problems encountered in scientific machine learning (SciML) and advocate for the paradigm first optimize, then discretize for their solution. This amounts to first choosing an appropriate infinite-dimensional algorithm which is then discretized in a second step. To illustrate this point, we discuss recently proposed state-of-the-art algorithms for SciML applications and see that they can be derived within this framework. Hence, this perspective allows for a principled guide for the design of optimization algorithms for SciML. As the infinite-dimensional viewpoint is presently underdeveloped, we formalize it here to foster the development of novel optimization algorithms.
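The paradigm can be made concrete on a toy problem. The following is a minimal sketch (not taken from the paper): for a linear model fitted in least squares, "first discretize, then optimize" gives plain gradient descent in parameter space, while "first optimize, then discretize" means writing gradient descent with respect to the L2 function-space geometry and only then projecting it onto the model class, which yields a Gauss-Newton-type update preconditioned by the Gram matrix of the basis. The monomial basis, step sizes, and iteration counts below are illustrative choices, not the authors' setup.

```python
import numpy as np

# Toy regression: fit sin(2*pi*x) on [0,1] with u_theta(x) = theta . phi(x),
# using a deliberately ill-conditioned monomial basis so the geometries differ.
x = np.linspace(0.0, 1.0, 200)
Phi = np.stack([x**k for k in range(8)], axis=1)   # monomials 1, x, ..., x^7
target = np.sin(2.0 * np.pi * x)
n, d = Phi.shape

def loss(theta):
    return 0.5 * np.mean((Phi @ theta - target) ** 2)

# "First discretize, then optimize": gradient descent in parameter space.
theta_gd = np.zeros(d)
for _ in range(500):
    r = Phi @ theta_gd - target
    theta_gd -= 1.0 * (Phi.T @ r) / n

# "First optimize, then discretize": L2-gradient descent in function space,
# projected onto the model's tangent space. The Gram matrix M discretizes the
# L2 inner product, so the update solves M dtheta = Phi^T r / n (Gauss-Newton).
theta_fs = np.zeros(d)
M = Phi.T @ Phi / n + 1e-10 * np.eye(d)  # tiny ridge for numerical stability
for _ in range(500):
    r = Phi @ theta_fs - target
    theta_fs -= 1.0 * np.linalg.solve(M, Phi.T @ r / n)

print(f"parameter-space GD loss: {loss(theta_gd):.3e}")
print(f"function-space GD loss:  {loss(theta_fs):.3e}")
```

With identical step size and budget, the function-space iteration is invariant to the basis conditioning and reaches the least-squares optimum almost immediately, whereas parameter-space gradient descent stalls on the poorly scaled directions — the kind of geometry-aware speedup the position paper argues for.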

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-muller24d,
  title     = {Position: Optimization in {S}ci{ML} Should Employ the Function Space Geometry},
  author    = {M\"{u}ller, Johannes and Zeinhofer, Marius},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {36705--36722},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/muller24d/muller24d.pdf},
  url       = {https://proceedings.mlr.press/v235/muller24d.html},
  abstract  = {We provide an infinite-dimensional view on optimization problems encountered in scientific machine learning (SciML) and advocate for the paradigm first optimize, then discretize for their solution. This amounts to first choosing an appropriate infinite-dimensional algorithm which is then discretized in a second step. To illustrate this point, we discuss recently proposed state-of-the-art algorithms for SciML applications and see that they can be derived within this framework. Hence, this perspective allows for a principled guide for the design of optimization algorithms for SciML. As the infinite-dimensional viewpoint is presently underdeveloped, we formalize it here to foster the development of novel optimization algorithms.}
}
Endnote
%0 Conference Paper
%T Position: Optimization in SciML Should Employ the Function Space Geometry
%A Johannes Müller
%A Marius Zeinhofer
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-muller24d
%I PMLR
%P 36705--36722
%U https://proceedings.mlr.press/v235/muller24d.html
%V 235
%X We provide an infinite-dimensional view on optimization problems encountered in scientific machine learning (SciML) and advocate for the paradigm first optimize, then discretize for their solution. This amounts to first choosing an appropriate infinite-dimensional algorithm which is then discretized in a second step. To illustrate this point, we discuss recently proposed state-of-the-art algorithms for SciML applications and see that they can be derived within this framework. Hence, this perspective allows for a principled guide for the design of optimization algorithms for SciML. As the infinite-dimensional viewpoint is presently underdeveloped, we formalize it here to foster the development of novel optimization algorithms.
APA
Müller, J. & Zeinhofer, M. (2024). Position: Optimization in SciML Should Employ the Function Space Geometry. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:36705-36722. Available from https://proceedings.mlr.press/v235/muller24d.html.