LyaNet: A Lyapunov Framework for Training Neural ODEs

Ivan Dario Jimenez Rodriguez, Aaron Ames, Yisong Yue
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:18687-18703, 2022.

Abstract

We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.
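To make the abstract's central object concrete, here is a minimal PyTorch sketch of a Lyapunov loss of the kind described: pick a potential V(x, y) that is minimized at the correct prediction (cross-entropy here, with the hidden state read directly as logits), and penalize violations of the exponential-stability condition dV/dt + kappa * V <= 0 at sampled states. The names `dynamics` and `kappa`, and the identity readout, are illustrative assumptions rather than the authors' implementation; see the linked repository for their actual code.

    import torch
    import torch.nn.functional as F

    def lyapunov_violation(dynamics, x, t, y, kappa=1.0):
        """Pointwise violation of the condition dV/dt + kappa * V <= 0.

        dynamics: the Neural ODE vector field f_theta(x, t) -> dx/dt.
        x:        sampled hidden states, shape (batch, dim).
        t:        time(s) at which to evaluate the dynamics.
        y:        integer class labels, shape (batch,).
        kappa:    desired exponential convergence rate (hyperparameter).
        """
        x = x.detach().requires_grad_(True)
        # Potential V(x, y): cross-entropy, reading the state directly as logits.
        # (Any differentiable loss minimized at the correct prediction would do.)
        V = F.cross_entropy(x, y, reduction="none")                 # (batch,)
        # dV/dx via autograd; note this never differentiates through an ODE solver.
        (grad_V,) = torch.autograd.grad(V.sum(), x, create_graph=True)
        Vdot = (grad_V * dynamics(x, t)).sum(dim=-1)                # <grad V, f>
        # Hinge loss on the stability condition: positive only where violated.
        return F.relu(Vdot + kappa * V)                             # (batch,)

    # Training step (Monte Carlo flavor): sample states rather than solving the
    # ODE, so no backpropagation through a solver or adjoint method is needed:
    #   loss = lyapunov_violation(f_theta, sampled_x, sampled_t, y).mean()
    #   loss.backward()

Driving this violation to zero at all reachable states enforces dV/dt <= -kappa * V along inference trajectories, and Gronwall's inequality then gives V(x(t)) <= V(x(0)) * exp(-kappa * t), which is the exponential convergence guarantee the abstract refers to.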

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-rodriguez22a,
  title     = {{L}ya{N}et: A {L}yapunov Framework for Training Neural {ODE}s},
  author    = {Rodriguez, Ivan Dario Jimenez and Ames, Aaron and Yue, Yisong},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {18687--18703},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/rodriguez22a/rodriguez22a.pdf},
  url       = {https://proceedings.mlr.press/v162/rodriguez22a.html},
  abstract  = {We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.}
}
Endnote
%0 Conference Paper
%T LyaNet: A Lyapunov Framework for Training Neural ODEs
%A Ivan Dario Jimenez Rodriguez
%A Aaron Ames
%A Yisong Yue
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-rodriguez22a
%I PMLR
%P 18687--18703
%U https://proceedings.mlr.press/v162/rodriguez22a.html
%V 162
%X We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.
APA
Rodriguez, I. D. J., Ames, A. & Yue, Y. (2022). LyaNet: A Lyapunov Framework for Training Neural ODEs. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:18687-18703. Available from https://proceedings.mlr.press/v162/rodriguez22a.html.