Locally Regularized Neural Differential Equations: Some Black Boxes were meant to remain closed!

Avik Pal, Alan Edelman, Christopher Vincent Rackauckas
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:26809-26819, 2023.

Abstract

Neural Differential Equations have become an important modeling framework due to their ability to adapt to new problems automatically. Training a neural differential equation is effectively a search over a space of plausible dynamical systems. Controlling the computational cost of these models is difficult since it depends on the number of steps the adaptive solver takes. Most prior works either use higher-order methods, which reduce prediction time but greatly increase training time, or reduce both training and prediction time by relying on specific training algorithms that are harder to use as drop-in replacements. In this manuscript, we use the internal cost heuristics of adaptive differential equation solvers at stochastic time points to guide training towards learning a dynamical system that is easier to integrate. We “close the blackbox” and allow our method to be used with any sensitivity method. Our experimental studies compare our method to global regularization and show that we attain similar performance without compromising flexibility of implementation. We develop two sampling strategies to trade off between performance and training time. Our method reduces the number of function evaluations to 0.556x–0.733x and accelerates predictions by 1.3x–2x.
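To make the idea above concrete, here is a minimal sketch of local regularization for a neural ODE, written in JAX purely for illustration; it is not the authors' implementation, which builds on adaptive solvers and their built-in error estimators. The sketch uses a hand-rolled Heun/Euler embedded pair on a fixed time grid, and all names (mlp_vector_field, heun_step_with_error, solve, loss, reg_weight, n_samples) are hypothetical. The regularization term is the solver's local error estimate, evaluated at a few randomly sampled time points and added to the data-fitting loss, so gradient descent is nudged towards dynamics that are cheaper to integrate.

```python
# Minimal sketch (assumed names and shapes): local regularization of a neural ODE
# using an embedded Runge-Kutta error estimate at randomly sampled time points.
import jax
import jax.numpy as jnp

def mlp_vector_field(params, t, y):
    """Tiny MLP dynamics f_theta(t, y); the architecture is purely illustrative."""
    w1, b1, w2, b2 = params
    h = jnp.tanh(w1 @ jnp.concatenate([y, jnp.array([t])]) + b1)
    return w2 @ h + b2

def heun_step_with_error(params, t, y, dt):
    """One Heun (2nd-order) step plus its embedded Euler (1st-order) estimate.
    Their difference is the local error estimate an adaptive solver would use
    to control its step size."""
    k1 = mlp_vector_field(params, t, y)
    k2 = mlp_vector_field(params, t + dt, y + dt * k1)
    y_high = y + 0.5 * dt * (k1 + k2)   # 2nd-order solution
    y_low = y + dt * k1                 # 1st-order solution
    err = jnp.linalg.norm(y_high - y_low)
    return y_high, err

def solve(params, y0, ts):
    """Fixed-grid integration that also records the local error estimates."""
    def step(y, t_pair):
        t, t_next = t_pair
        y_next, err = heun_step_with_error(params, t, y, t_next - t)
        return y_next, (y_next, err)
    _, (ys, errs) = jax.lax.scan(step, y0, (ts[:-1], ts[1:]))
    return ys, errs

def loss(params, y0, ts, targets, key, reg_weight=1e-2, n_samples=4):
    """Data-fitting loss plus local regularization: penalize the error estimate
    at a few randomly chosen (stochastic) time points so the learned dynamics
    become cheaper to integrate."""
    ys, errs = solve(params, y0, ts)
    fit = jnp.mean((ys - targets) ** 2)
    idx = jax.random.choice(key, errs.shape[0], shape=(n_samples,), replace=False)
    local_reg = jnp.mean(errs[idx])
    return fit + reg_weight * local_reg
```

Differentiating with jax.grad(loss) then gives the gradient of the combined objective, with the number of sampled time points n_samples acting as the knob between regularization quality and per-iteration cost.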

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-pal23a,
  title     = {Locally Regularized Neural Differential Equations: Some Black Boxes were meant to remain closed!},
  author    = {Pal, Avik and Edelman, Alan and Rackauckas, Christopher Vincent},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {26809--26819},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/pal23a/pal23a.pdf},
  url       = {https://proceedings.mlr.press/v202/pal23a.html}
}
Endnote
%0 Conference Paper
%T Locally Regularized Neural Differential Equations: Some Black Boxes were meant to remain closed!
%A Avik Pal
%A Alan Edelman
%A Christopher Vincent Rackauckas
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-pal23a
%I PMLR
%P 26809--26819
%U https://proceedings.mlr.press/v202/pal23a.html
%V 202
APA
Pal, A., Edelman, A. & Rackauckas, C. V. (2023). Locally Regularized Neural Differential Equations: Some Black Boxes were meant to remain closed! Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:26809-26819. Available from https://proceedings.mlr.press/v202/pal23a.html.
