CoLoRA: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations

Jules Berman, Benjamin Peherstorfer
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:3565-3583, 2024.

Abstract

This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions. The adaptation can be either purely data-driven or via an equation-driven variational approach that provides Galerkin-optimal approximations. Because CoLoRA approximates solution fields locally in time, the rank of the weights can be kept small, which means that only a few training trajectories are required offline, so CoLoRA is well suited for data-scarce regimes. Predictions with CoLoRA are orders of magnitude faster than with classical methods, and their accuracy and parameter efficiency are higher than those of other neural network approaches.
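The core mechanism described above, a pre-trained weight matrix combined with a low-rank update whose scale evolves continuously in time, can be sketched in a few lines. The following is a hedged illustration only, not the authors' implementation: the layer sizes, rank, activation, and all variable names are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def colora_layer(x, W, b, A, B, alpha):
    """Affine layer whose weight is adapted by a rank-r update scaled by alpha.

    W, b play the role of weights pre-trained offline (held fixed online);
    A (d_out x r) and B (r x d_in) span the low-rank adaptation; alpha is
    the scalar coefficient that is adapted continuously in time.
    """
    W_t = W + alpha * (A @ B)      # low-rank, time-dependent weight update
    return np.tanh(W_t @ x + b)

# Illustrative sizes: a small rank r keeps the number of online parameters low.
d_in, d_out, r = 8, 8, 2
W = rng.normal(size=(d_out, d_in))
b = np.zeros(d_out)
A = rng.normal(size=(d_out, r))
B = rng.normal(size=(r, d_in))

x = rng.normal(size=d_in)
u0 = colora_layer(x, W, b, A, B, alpha=0.0)  # alpha=0 recovers the frozen layer
u1 = colora_layer(x, W, b, A, B, alpha=0.5)  # adapted state at a later time
```

With `alpha = 0` the layer reduces exactly to the pre-trained one; evolving `alpha` (by data-driven regression or an equation-driven variational update, as the abstract distinguishes) moves the represented solution field in time while only a handful of scalars change online.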

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-berman24b,
  title = {{C}o{L}o{RA}: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations},
  author = {Berman, Jules and Peherstorfer, Benjamin},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages = {3565--3583},
  year = {2024},
  editor = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume = {235},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/berman24b/berman24b.pdf},
  url = {https://proceedings.mlr.press/v235/berman24b.html},
  abstract = {This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions. The adaptation can be either purely data-driven or via an equation-driven variational approach that provides Galerkin-optimal approximations. Because CoLoRA approximates solution fields locally in time, the rank of the weights can be kept small, which means that only a few training trajectories are required offline, so CoLoRA is well suited for data-scarce regimes. Predictions with CoLoRA are orders of magnitude faster than with classical methods, and their accuracy and parameter efficiency are higher than those of other neural network approaches.}
}
Endnote
%0 Conference Paper
%T CoLoRA: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations
%A Jules Berman
%A Benjamin Peherstorfer
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-berman24b
%I PMLR
%P 3565--3583
%U https://proceedings.mlr.press/v235/berman24b.html
%V 235
%X This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions. The adaptation can be either purely data-driven or via an equation-driven variational approach that provides Galerkin-optimal approximations. Because CoLoRA approximates solution fields locally in time, the rank of the weights can be kept small, which means that only a few training trajectories are required offline, so CoLoRA is well suited for data-scarce regimes. Predictions with CoLoRA are orders of magnitude faster than with classical methods, and their accuracy and parameter efficiency are higher than those of other neural network approaches.
APA
Berman, J. & Peherstorfer, B. (2024). CoLoRA: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:3565-3583. Available from https://proceedings.mlr.press/v235/berman24b.html.