Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Ordinary Differential Equations

Jonas Beck, Nathanael Bosch, Michael Deistler, Kyra L. Kadhim, Jakob H. Macke, Philipp Hennig, Philipp Berens
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:3305-3326, 2024.

Abstract

Ordinary differential equations (ODEs) are widely used to describe dynamical systems in science, but identifying parameters that explain experimental measurements is challenging. In particular, although ODEs are differentiable and would allow for gradient-based parameter optimization, the nonlinear dynamics of ODEs often lead to many local minima and extreme sensitivity to initial conditions. We therefore propose diffusion tempering, a novel regularization technique for probabilistic numerical methods which improves convergence of gradient-based parameter optimization in ODEs. By iteratively reducing a noise parameter of the probabilistic integrator, the proposed method converges more reliably to the true parameters. We demonstrate that our method is effective for dynamical systems of different complexity and show that it obtains reliable parameter estimates for a Hodgkin–Huxley model with a practically relevant number of parameters.
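The core idea of the abstract — gradient descent on a loss whose ruggedness is controlled by a noise parameter that is gradually annealed toward zero — can be illustrated on a toy problem. The sketch below is not the paper's method (which uses the diffusion parameter of a probabilistic ODE solver); it substitutes a hand-made Gaussian-smoothed loss whose oscillatory term is attenuated by the noise level `kappa`, so the smoothing effect can be written in closed form. All names and constants here are illustrative assumptions.

```python
import numpy as np

def smoothed_grad(theta, kappa):
    # Gradient of the Gaussian-smoothed toy loss
    #   E_z[ L(theta + kappa*z) ],  z ~ N(0, 1),
    # with L(t) = (t - 3)^2 + 0.5*cos(20*t).
    # Smoothing attenuates the oscillatory term by exp(-200*kappa^2),
    # mimicking how a large solver-noise parameter flattens the
    # parameter-estimation loss surface.
    a = np.exp(-200.0 * kappa**2)
    return 2.0 * (theta - 3.0) - 10.0 * a * np.sin(20.0 * theta)

def temper(theta, kappas, lr=0.005, steps=500):
    # Outer loop anneals the noise level; inner loop is plain
    # gradient descent on the smoothed loss at that level.
    for kappa in kappas:
        for _ in range(steps):
            theta -= lr * smoothed_grad(theta, kappa)
    return theta

# Tempered run: start far from the global optimum at theta = 3.
tempered = temper(1.0, kappas=[0.3, 0.1, 0.03, 0.01, 0.0])
# Baseline: optimize the rugged (kappa = 0) loss directly.
plain = temper(1.0, kappas=[0.0], steps=2500)
print(round(tempered, 2), round(plain, 2))
```

With the annealing schedule, descent first follows the smoothed quadratic basin to the neighborhood of the global optimum and then refines as the ripples are restored; the baseline without tempering gets trapped in a local minimum near its starting point.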

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-beck24a,
  title     = {Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Ordinary Differential Equations},
  author    = {Beck, Jonas and Bosch, Nathanael and Deistler, Michael and Kadhim, Kyra L. and Macke, Jakob H. and Hennig, Philipp and Berens, Philipp},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {3305--3326},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/beck24a/beck24a.pdf},
  url       = {https://proceedings.mlr.press/v235/beck24a.html},
  abstract  = {Ordinary differential equations (ODEs) are widely used to describe dynamical systems in science, but identifying parameters that explain experimental measurements is challenging. In particular, although ODEs are differentiable and would allow for gradient-based parameter optimization, the nonlinear dynamics of ODEs often lead to many local minima and extreme sensitivity to initial conditions. We therefore propose diffusion tempering, a novel regularization technique for probabilistic numerical methods which improves convergence of gradient-based parameter optimization in ODEs. By iteratively reducing a noise parameter of the probabilistic integrator, the proposed method converges more reliably to the true parameters. We demonstrate that our method is effective for dynamical systems of different complexity and show that it obtains reliable parameter estimates for a Hodgkin–Huxley model with a practically relevant number of parameters.}
}
Endnote
%0 Conference Paper
%T Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Ordinary Differential Equations
%A Jonas Beck
%A Nathanael Bosch
%A Michael Deistler
%A Kyra L. Kadhim
%A Jakob H. Macke
%A Philipp Hennig
%A Philipp Berens
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-beck24a
%I PMLR
%P 3305--3326
%U https://proceedings.mlr.press/v235/beck24a.html
%V 235
%X Ordinary differential equations (ODEs) are widely used to describe dynamical systems in science, but identifying parameters that explain experimental measurements is challenging. In particular, although ODEs are differentiable and would allow for gradient-based parameter optimization, the nonlinear dynamics of ODEs often lead to many local minima and extreme sensitivity to initial conditions. We therefore propose diffusion tempering, a novel regularization technique for probabilistic numerical methods which improves convergence of gradient-based parameter optimization in ODEs. By iteratively reducing a noise parameter of the probabilistic integrator, the proposed method converges more reliably to the true parameters. We demonstrate that our method is effective for dynamical systems of different complexity and show that it obtains reliable parameter estimates for a Hodgkin–Huxley model with a practically relevant number of parameters.
APA
Beck, J., Bosch, N., Deistler, M., Kadhim, K.L., Macke, J.H., Hennig, P. & Berens, P. (2024). Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Ordinary Differential Equations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:3305-3326. Available from https://proceedings.mlr.press/v235/beck24a.html.