Predicting Ordinary Differential Equations with Transformers

Sören Becker, Michal Klein, Alexander Neitz, Giambattista Parascandolo, Niki Kilbertus
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:1978-2002, 2023.

Abstract

We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory. We demonstrate in extensive empirical evaluations that our model performs better or on par with existing methods in terms of accurate recovery across various settings. Moreover, our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-becker23a,
  title     = {Predicting Ordinary Differential Equations with Transformers},
  author    = {Becker, S\"{o}ren and Klein, Michal and Neitz, Alexander and Parascandolo, Giambattista and Kilbertus, Niki},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {1978--2002},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/becker23a/becker23a.pdf},
  url       = {https://proceedings.mlr.press/v202/becker23a.html},
  abstract  = {We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory. We demonstrate in extensive empirical evaluations that our model performs better or on par with existing methods in terms of accurate recovery across various settings. Moreover, our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.}
}
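To use this entry, it can be saved in a bibliography file and cited by its key. A minimal LaTeX sketch (the filename `references.bib` is an assumption, not part of the page):

```latex
% references.bib contains the @InProceedings{pmlr-v202-becker23a, ...} entry above
\documentclass{article}
\begin{document}
Becker et al.~\cite{pmlr-v202-becker23a} recover symbolic ODEs with transformers.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```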
Endnote
%0 Conference Paper
%T Predicting Ordinary Differential Equations with Transformers
%A Sören Becker
%A Michal Klein
%A Alexander Neitz
%A Giambattista Parascandolo
%A Niki Kilbertus
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-becker23a
%I PMLR
%P 1978--2002
%U https://proceedings.mlr.press/v202/becker23a.html
%V 202
%X We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory. We demonstrate in extensive empirical evaluations that our model performs better or on par with existing methods in terms of accurate recovery across various settings. Moreover, our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
APA
Becker, S., Klein, M., Neitz, A., Parascandolo, G. & Kilbertus, N. (2023). Predicting Ordinary Differential Equations with Transformers. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:1978-2002. Available from https://proceedings.mlr.press/v202/becker23a.html.