Time and State Dependent Neural Delay Differential Equations

Thibault Monsel, Onofrio Semeraro, Lionel Mathelin, Guillaume Charpiat
Proceedings of the 1st ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications", PMLR 255:1-20, 2024.

Abstract

Discontinuities and delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics. These systems cannot be properly modelled and simulated with standard Ordinary Differential Equations (ODE), or data-driven approximations such as Neural Ordinary Differential Equations (NODE). To circumvent this issue, latent variables are typically introduced to solve the dynamics of the system in a higher dimensional space and obtain the solution as a projection to the original space. However, this solution lacks physical interpretability. In contrast, Delay Differential Equations (DDEs), and their data-driven approximated counterparts, naturally appear as good candidates to characterize such systems. In this work we revisit the recently proposed Neural DDE by introducing Neural State-Dependent DDE (SDDDE), a general and flexible framework that can model multiple and state- and time-dependent delays. We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems. Code is available at the repository https://github.com/thibmonsel/Time-and-State-Dependent-Neural-Delay-Differential-Equations
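To make the modelling idea in the abstract concrete, below is a minimal, self-contained PyTorch sketch of a state- and time-dependent neural DDE, y'(t) = f_theta(t, y(t), y(t - tau_phi(t, y(t)))), where both the vector field f_theta and the delay tau_phi are small neural networks. This is illustrative only and is not the authors' implementation: the names (NeuralSDDDE, integrate, history), the fixed-step Euler scheme, and the linear interpolation of past states are simplifications assumed here; the repository linked above contains the full method.

import torch
import torch.nn as nn


class NeuralSDDDE(nn.Module):
    # y'(t) = f_theta(t, y(t), y(t - tau_phi(t, y(t)))), with 0 < tau < max_delay
    def __init__(self, dim, hidden=64, max_delay=2.0):
        super().__init__()
        self.max_delay = max_delay
        self.f = nn.Sequential(nn.Linear(2 * dim + 1, hidden), nn.Tanh(),
                               nn.Linear(hidden, dim))
        self.tau = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 1))

    def delay(self, t, y):
        # sigmoid keeps the learned delay positive and bounded, so the delayed
        # time t - tau never falls outside the stored history and trajectory
        return self.max_delay * torch.sigmoid(self.tau(torch.cat([t, y], dim=-1)))

    def drift(self, t, y, y_delayed):
        return self.f(torch.cat([t, y, y_delayed], dim=-1))


def integrate(model, history, t0, t1, dt):
    # Fixed-step Euler integration. The delayed state is read from the history
    # function before t0 and linearly interpolated from the trajectory afterwards.
    ts, ys = [t0], [history(t0)]
    t, y = t0, ys[0]
    while t < t1 - 1e-9:
        tt = torch.tensor([t])
        t_del = tt - model.delay(tt, y)          # kept as a tensor so the
        if t_del.item() <= t0:                   # interpolation weight below
            y_del = history(t_del.item())        # stays differentiable in tau
        else:
            i = max(k for k in range(len(ts) - 1) if ts[k] <= t_del.item())
            w = (t_del - ts[i]) / (ts[i + 1] - ts[i])
            y_del = (1 - w) * ys[i] + w * ys[i + 1]
        y = y + dt * model.drift(tt, y, y_del)   # Euler step
        t += dt
        ts.append(t)
        ys.append(y)
    return torch.tensor(ts), torch.stack(ys)


# Hypothetical usage: 2-D state, constant history before t0, short horizon.
model = NeuralSDDDE(dim=2)
trajectory_times, trajectory = integrate(model, lambda t: torch.ones(2),
                                         t0=0.0, t1=5.0, dt=0.05)
print(trajectory.shape)  # (number of stored steps, 2)

One design choice worth noting in this sketch: bounding the learned delay with a sigmoid keeps t - tau(t, y) inside the interval covered by the history function and the already-computed trajectory, which is what allows the delayed state to be looked up at every step.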

Cite this Paper


BibTeX
@InProceedings{pmlr-v255-monsel24a,
  title     = {Time and State Dependent Neural Delay Differential Equations},
  author    = {Monsel, Thibault and Semeraro, Onofrio and Mathelin, Lionel and Charpiat, Guillaume},
  booktitle = {Proceedings of the 1st ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications"},
  pages     = {1--20},
  year      = {2024},
  editor    = {Coelho, Cecília and Zimmering, Bernd and Costa, M. Fernanda P. and Ferrás, Luís L. and Niggemann, Oliver},
  volume    = {255},
  series    = {Proceedings of Machine Learning Research},
  month     = {20 Oct},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v255/main/assets/monsel24a/monsel24a.pdf},
  url       = {https://proceedings.mlr.press/v255/monsel24a.html},
  abstract  = {Discontinuities and delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics. These systems cannot be properly modelled and simulated with standard Ordinary Differential Equations (ODE), or data-driven approximations such as Neural Ordinary Differential Equations (NODE). To circumvent this issue, latent variables are typically introduced to solve the dynamics of the system in a higher dimensional space and obtain the solution as a projection to the original space. However, this solution lacks physical interpretability. In contrast, Delay Differential Equations (DDEs), and their data-driven approximated counterparts, naturally appear as good candidates to characterize such systems. In this work we revisit the recently proposed Neural DDE by introducing Neural State-Dependent DDE (SDDDE), a general and flexible framework that can model multiple and state- and time-dependent delays. We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems. Code is available at the repository https://github.com/thibmonsel/Time-and-State-Dependent-Neural-Delay-Differential-Equations}
}
Endnote
%0 Conference Paper
%T Time and State Dependent Neural Delay Differential Equations
%A Thibault Monsel
%A Onofrio Semeraro
%A Lionel Mathelin
%A Guillaume Charpiat
%B Proceedings of the 1st ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications"
%C Proceedings of Machine Learning Research
%D 2024
%E Cecília Coelho
%E Bernd Zimmering
%E M. Fernanda P. Costa
%E Luís L. Ferrás
%E Oliver Niggemann
%F pmlr-v255-monsel24a
%I PMLR
%P 1--20
%U https://proceedings.mlr.press/v255/monsel24a.html
%V 255
%X Discontinuities and delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics. These systems cannot be properly modelled and simulated with standard Ordinary Differential Equations (ODE), or data-driven approximations such as Neural Ordinary Differential Equations (NODE). To circumvent this issue, latent variables are typically introduced to solve the dynamics of the system in a higher dimensional space and obtain the solution as a projection to the original space. However, this solution lacks physical interpretability. In contrast, Delay Differential Equations (DDEs), and their data-driven approximated counterparts, naturally appear as good candidates to characterize such systems. In this work we revisit the recently proposed Neural DDE by introducing Neural State-Dependent DDE (SDDDE), a general and flexible framework that can model multiple and state- and time-dependent delays. We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems. Code is available at the repository https://github.com/thibmonsel/Time-and-State-Dependent-Neural-Delay-Differential-Equations
APA
Monsel, T., Semeraro, O., Mathelin, L. & Charpiat, G. (2024). Time and State Dependent Neural Delay Differential Equations. Proceedings of the 1st ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications", in Proceedings of Machine Learning Research 255:1-20. Available from https://proceedings.mlr.press/v255/monsel24a.html.
