Rademacher complexity of neural ODEs via Chen-Fliess series

Joshua Hanson, Maxim Raginsky
Proceedings of the 6th Annual Learning for Dynamics & Control Conference, PMLR 242:758-769, 2024.

Abstract

We show how continuous-depth neural ODE models can be framed as single-layer, infinite-width nets using the Chen-Fliess series expansion for nonlinear ODEs. In this net, the output “weights” are taken from the signature of the control input — a tool used to represent infinite-dimensional paths as a sequence of tensors — which comprises iterated integrals of the control input over a simplex. The “features” are taken to be iterated Lie derivatives of the output function with respect to the vector fields in the controlled ODE model. The main result of this work applies this framework to derive compact expressions for the Rademacher complexity of ODE models that map an initial condition to a scalar output at some terminal time. The result leverages the straightforward analysis afforded by single-layer architectures. We conclude with some examples instantiating the bound for some specific systems and discuss potential follow-up work.
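The framing described above can be sketched in standard Chen-Fliess notation (a hedged sketch using the usual conventions for control-affine systems, not the paper's exact statement; ordering conventions for the words vary across references):

```latex
% Control-affine system with drift g_0, control vector fields g_1,...,g_m,
% output map h, and the convention u_0(t) = 1:
\[
\dot{x}(t) = g_0(x(t)) + \sum_{i=1}^{m} g_i(x(t))\,u_i(t), \qquad y(t) = h(x(t)).
\]
% Chen-Fliess series: a sum over words \eta = i_k \cdots i_1 in the alphabet
% \{0, 1, \dots, m\}, pairing iterated Lie derivatives ("features") with
% iterated integrals of the control ("weights", i.e. the signature):
\[
y(t) \;=\; \sum_{\eta = i_k \cdots i_1}
\underbrace{L_{g_{i_1}} \cdots L_{g_{i_k}} h(x_0)}_{\text{``features''}}\;
\underbrace{E_\eta[u](t)}_{\text{``weights''}},
\]
% where the iterated integrals over the simplex are defined recursively:
\[
E_{i_k \cdots i_1}[u](t) \;=\; \int_0^t u_{i_k}(\tau)\, E_{i_{k-1} \cdots i_1}[u](\tau)\, d\tau,
\qquad E_{\emptyset}[u](t) \equiv 1.
\]
```

Viewing the collection of iterated integrals as the "output layer" and the Lie-derivative coefficients as fixed "features" is what makes the single-layer, infinite-width analogy precise.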

Cite this Paper


BibTeX
@InProceedings{pmlr-v242-hanson24a,
  title     = {{R}ademacher complexity of neural {ODE}s via {C}hen-{F}liess series},
  author    = {Hanson, Joshua and Raginsky, Maxim},
  booktitle = {Proceedings of the 6th Annual Learning for Dynamics \& Control Conference},
  pages     = {758--769},
  year      = {2024},
  editor    = {Abate, Alessandro and Cannon, Mark and Margellos, Kostas and Papachristodoulou, Antonis},
  volume    = {242},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v242/hanson24a/hanson24a.pdf},
  url       = {https://proceedings.mlr.press/v242/hanson24a.html},
  abstract  = {We show how continuous-depth neural ODE models can be framed as single-layer, infinite-width nets using the Chen-Fliess series expansion for nonlinear ODEs. In this net, the output ``weights'' are taken from the signature of the control input --- a tool used to represent infinite-dimensional paths as a sequence of tensors --- which comprises iterated integrals of the control input over a simplex. The ``features'' are taken to be iterated Lie derivatives of the output function with respect to the vector fields in the controlled ODE model. The main result of this work applies this framework to derive compact expressions for the Rademacher complexity of ODE models that map an initial condition to a scalar output at some terminal time. The result leverages the straightforward analysis afforded by single-layer architectures. We conclude with some examples instantiating the bound for some specific systems and discuss potential follow-up work.}
}
Endnote
%0 Conference Paper
%T Rademacher complexity of neural ODEs via Chen-Fliess series
%A Joshua Hanson
%A Maxim Raginsky
%B Proceedings of the 6th Annual Learning for Dynamics & Control Conference
%C Proceedings of Machine Learning Research
%D 2024
%E Alessandro Abate
%E Mark Cannon
%E Kostas Margellos
%E Antonis Papachristodoulou
%F pmlr-v242-hanson24a
%I PMLR
%P 758--769
%U https://proceedings.mlr.press/v242/hanson24a.html
%V 242
%X We show how continuous-depth neural ODE models can be framed as single-layer, infinite-width nets using the Chen-Fliess series expansion for nonlinear ODEs. In this net, the output “weights” are taken from the signature of the control input — a tool used to represent infinite-dimensional paths as a sequence of tensors — which comprises iterated integrals of the control input over a simplex. The “features” are taken to be iterated Lie derivatives of the output function with respect to the vector fields in the controlled ODE model. The main result of this work applies this framework to derive compact expressions for the Rademacher complexity of ODE models that map an initial condition to a scalar output at some terminal time. The result leverages the straightforward analysis afforded by single-layer architectures. We conclude with some examples instantiating the bound for some specific systems and discuss potential follow-up work.
APA
Hanson, J. & Raginsky, M. (2024). Rademacher complexity of neural ODEs via Chen-Fliess series. Proceedings of the 6th Annual Learning for Dynamics & Control Conference, in Proceedings of Machine Learning Research 242:758-769. Available from https://proceedings.mlr.press/v242/hanson24a.html.