Log Neural Controlled Differential Equations: The Lie Brackets Make A Difference

Benjamin Walker, Andrew Donald Mcleod, Tiexin Qin, Yichuan Cheng, Haoliang Li, Terry Lyons
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:49822-49844, 2024.

Abstract

The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE’s vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE’s solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.
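As a brief, informal sketch (not quoted from the paper), an NCDE evolves its hidden state $h$ according to

$$h_t = h_{t_0} + \int_{t_0}^{t} f_\theta(h_s)\,\mathrm{d}X_s,$$

where $X$ is the control path constructed from the data and $f_\theta$ is a neural network. Over a subinterval $[t_k, t_{k+1}]$, the depth-2 Log-ODE method approximates this CDE by the ODE

$$\frac{\mathrm{d}z}{\mathrm{d}s} = \sum_{i} f^{\,i}_\theta(z)\,\frac{\Delta X^{i}_{k}}{\Delta t_k} \;+\; \sum_{i<j} \big[f^{\,i}_\theta, f^{\,j}_\theta\big](z)\,\frac{A^{i,j}_{k}}{\Delta t_k},$$

where $\Delta X_k$ and $A_k$ denote the increment and Lévy-area terms of the depth-2 log-signature of $X$ on $[t_k, t_{k+1}]$, $\Delta t_k = t_{k+1} - t_k$, and $[f^{i}, f^{j}](z) = \nabla f^{j}(z)\,f^{i}(z) - \nabla f^{i}(z)\,f^{j}(z)$ is the Lie bracket of vector fields. The symbol choices here are expository assumptions, but computing these brackets of the learned vector field, rather than treating the log-signature as a generic network input, is what distinguishes Log-NCDEs from NRDEs.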

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-walker24a,
  title     = {Log Neural Controlled Differential Equations: The Lie Brackets Make A Difference},
  author    = {Walker, Benjamin and Mcleod, Andrew Donald and Qin, Tiexin and Cheng, Yichuan and Li, Haoliang and Lyons, Terry},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {49822--49844},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/walker24a/walker24a.pdf},
  url       = {https://proceedings.mlr.press/v235/walker24a.html},
  abstract  = {The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE’s vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE’s solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.}
}
Endnote
%0 Conference Paper
%T Log Neural Controlled Differential Equations: The Lie Brackets Make A Difference
%A Benjamin Walker
%A Andrew Donald Mcleod
%A Tiexin Qin
%A Yichuan Cheng
%A Haoliang Li
%A Terry Lyons
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-walker24a
%I PMLR
%P 49822--49844
%U https://proceedings.mlr.press/v235/walker24a.html
%V 235
%X The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE’s vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE’s solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.
APA
Walker, B., Mcleod, A.D., Qin, T., Cheng, Y., Li, H. & Lyons, T. (2024). Log Neural Controlled Differential Equations: The Lie Brackets Make A Difference. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:49822-49844. Available from https://proceedings.mlr.press/v235/walker24a.html.