Learning non-Markovian Dynamical Systems with Signature-based Encoders

Eliott Pradeleix, Rémy Hosseinkhan-Boucher, Alena Shilova, Onofrio Semeraro, Lionel Mathelin
Proceedings of the 2nd ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications", PMLR 277:1-25, 2025.

Abstract

Neural ordinary differential equations offer an effective framework for modeling dynamical systems by learning a continuous-time vector field. However, they rely on the Markovian assumption (that future states depend only on the current state), which often does not hold in real-world scenarios where the dynamics may depend on the history of past states. This limitation becomes especially evident in settings involving the continuous control of complex systems with delays and memory effects. To capture historical dependencies, existing approaches often rely on recurrent neural network (RNN)-based encoders, which are inherently discrete and struggle with continuous modeling. In addition, they may exhibit poor training behavior. In this work, we investigate the use of the signature transform as an encoder for learning non-Markovian dynamics in a continuous-time setting. The signature transform offers a continuous-time alternative with strong theoretical foundations and proven efficiency in summarizing multidimensional information in time. We integrate a signature-based encoding scheme into encoder-decoder dynamics models and demonstrate that it outperforms RNN-based alternatives in test performance on synthetic benchmarks. The code is available at https://github.com/eliottprdlx/Signature-Encoders-For-Dynamics-Learning.git.
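The abstract describes an encoder-decoder pattern: a signature-transform encoder summarizes the observed history into a latent state, and a continuous-time (neural ODE) decoder rolls the dynamics forward. The sketch below is a minimal, hypothetical illustration of that pattern, assuming the signatory package (truncated path signatures) and torchdiffeq (ODE solvers); the class names (SignatureEncoder, LatentODEFunc, SigODEModel) and all dimensions are illustrative assumptions, not the authors' implementation (see their repository for that).

# Minimal sketch of a signature-encoder / neural-ODE-decoder model.
# Assumes `signatory` and `torchdiffeq`; not the authors' code.
import torch
import torch.nn as nn
import signatory
from torchdiffeq import odeint

class SignatureEncoder(nn.Module):
    """Map an observed history (batch, length, channels) to an initial latent state z0."""
    def __init__(self, channels, depth, latent_dim):
        super().__init__()
        self.depth = depth
        sig_dim = signatory.signature_channels(channels, depth)  # size of truncated signature
        self.proj = nn.Linear(sig_dim, latent_dim)

    def forward(self, history):
        # history may include time as one of its channels; the signature summarizes the whole path
        sig = signatory.signature(history, self.depth)  # (batch, sig_dim)
        return self.proj(sig)

class LatentODEFunc(nn.Module):
    """Learned vector field dz/dt = f(z)."""
    def __init__(self, latent_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, latent_dim))

    def forward(self, t, z):
        return self.net(z)

class SigODEModel(nn.Module):
    """Signature encoder followed by a latent neural ODE decoder and a linear readout."""
    def __init__(self, channels, depth, latent_dim, obs_dim):
        super().__init__()
        self.encoder = SignatureEncoder(channels, depth, latent_dim)
        self.odefunc = LatentODEFunc(latent_dim)
        self.readout = nn.Linear(latent_dim, obs_dim)

    def forward(self, history, t_future):
        z0 = self.encoder(history)               # encode the non-Markovian history
        zt = odeint(self.odefunc, z0, t_future)  # (len(t_future), batch, latent_dim)
        return self.readout(zt)                  # decode latent trajectory to observations

The point of the design is that the fixed-size signature replaces a recurrent pass over the history, so the downstream ODE decoder receives a continuous-time, order-aware summary of past observations rather than the final hidden state of an RNN.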

Cite this Paper


BibTeX
@InProceedings{pmlr-v277-pradeleix25a,
  title     = {Learning non-Markovian Dynamical Systems with Signature-based Encoders},
  author    = {Pradeleix, Eliott and Hosseinkhan-Boucher, R\'{e}my and Shilova, Alena and Semeraro, Onofrio and Mathelin, Lionel},
  booktitle = {Proceedings of the 2nd ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications"},
  pages     = {1--25},
  year      = {2025},
  editor    = {Coelho, Cecília and Zimmering, Bernd and Costa, M. Fernanda P. and Ferrás, Luís L. and Niggemann, Oliver},
  volume    = {277},
  series    = {Proceedings of Machine Learning Research},
  month     = {26 Oct},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v277/main/assets/pradeleix25a/pradeleix25a.pdf},
  url       = {https://proceedings.mlr.press/v277/pradeleix25a.html},
  abstract  = {Neural ordinary differential equations offer an effective framework for modeling dynamical systems by learning a continuous-time vector field. However, they rely on the Markovian assumption (that future states depend only on the current state), which often does not hold in real-world scenarios where the dynamics may depend on the history of past states. This limitation becomes especially evident in settings involving the continuous control of complex systems with delays and memory effects. To capture historical dependencies, existing approaches often rely on recurrent neural network (RNN)-based encoders, which are inherently discrete and struggle with continuous modeling. In addition, they may exhibit poor training behavior. In this work, we investigate the use of the signature transform as an encoder for learning non-Markovian dynamics in a continuous-time setting. The signature transform offers a continuous-time alternative with strong theoretical foundations and proven efficiency in summarizing multidimensional information in time. We integrate a signature-based encoding scheme into encoder-decoder dynamics models and demonstrate that it outperforms RNN-based alternatives in test performance on synthetic benchmarks. The code is available at https://github.com/eliottprdlx/Signature-Encoders-For-Dynamics-Learning.git.}
}
Endnote
%0 Conference Paper
%T Learning non-Markovian Dynamical Systems with Signature-based Encoders
%A Eliott Pradeleix
%A Rémy Hosseinkhan-Boucher
%A Alena Shilova
%A Onofrio Semeraro
%A Lionel Mathelin
%B Proceedings of the 2nd ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications"
%C Proceedings of Machine Learning Research
%D 2025
%E Cecília Coelho
%E Bernd Zimmering
%E M. Fernanda P. Costa
%E Luís L. Ferrás
%E Oliver Niggemann
%F pmlr-v277-pradeleix25a
%I PMLR
%P 1--25
%U https://proceedings.mlr.press/v277/pradeleix25a.html
%V 277
%X Neural ordinary differential equations offer an effective framework for modeling dynamical systems by learning a continuous-time vector field. However, they rely on the Markovian assumption (that future states depend only on the current state), which often does not hold in real-world scenarios where the dynamics may depend on the history of past states. This limitation becomes especially evident in settings involving the continuous control of complex systems with delays and memory effects. To capture historical dependencies, existing approaches often rely on recurrent neural network (RNN)-based encoders, which are inherently discrete and struggle with continuous modeling. In addition, they may exhibit poor training behavior. In this work, we investigate the use of the signature transform as an encoder for learning non-Markovian dynamics in a continuous-time setting. The signature transform offers a continuous-time alternative with strong theoretical foundations and proven efficiency in summarizing multidimensional information in time. We integrate a signature-based encoding scheme into encoder-decoder dynamics models and demonstrate that it outperforms RNN-based alternatives in test performance on synthetic benchmarks. The code is available at https://github.com/eliottprdlx/Signature-Encoders-For-Dynamics-Learning.git.
APA
Pradeleix, E., Hosseinkhan-Boucher, R., Shilova, A., Semeraro, O. & Mathelin, L. (2025). Learning non-Markovian Dynamical Systems with Signature-based Encoders. Proceedings of the 2nd ECAI Workshop on "Machine Learning Meets Differential Equations: From Theory to Applications", in Proceedings of Machine Learning Research 277:1-25. Available from https://proceedings.mlr.press/v277/pradeleix25a.html.