Multiphase MCMC Sampling for Parameter Inference in Nonlinear Ordinary Differential Equations

Alan Lazarus, Dirk Husmeier, Theodore Papamarkou
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1252-1260, 2018.

Abstract

Traditionally, ODE parameter inference relies on solving the system of ODEs and assessing the fit of the estimated signal to the observations. However, nonlinear ODEs often do not permit closed-form solutions. Using numerical methods to solve the equations results in prohibitive computational costs, particularly when one adopts a Bayesian approach in sampling parameters from a posterior distribution. With the introduction of gradient matching, we can abandon the need to numerically solve the system of equations. Inherent in these efficient procedures is an introduction of bias to the learning problem, as we no longer sample based on the exact likelihood function. This paper presents a multiphase MCMC approach that attempts to close the gap between efficiency and accuracy. By sampling using a surrogate likelihood, we accelerate convergence to the stationary distribution before sampling using the exact likelihood. We demonstrate that this method combines the efficiency of gradient matching and the accuracy of the exact likelihood scheme.
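The two-phase idea can be illustrated with a minimal sketch (this is an illustrative toy, not the paper's algorithm or benchmarks; the ODE, the polynomial smoother, and all tuning constants such as `tau` are assumptions made here for clarity). Phase 1 runs random-walk Metropolis on a cheap gradient-matching-style surrogate that never solves the ODE; phase 2 restarts the same sampler on the exact likelihood, initialised at the surrogate chain's endpoint.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy problem (illustrative, not from the paper) ----------------------
# ODE: dx/dt = -theta * x, x(0) = 1, observed with Gaussian noise.
theta_true, sigma = 1.5, 0.05
t_obs = np.linspace(0.0, 2.0, 20)
y_obs = np.exp(-theta_true * t_obs) + rng.normal(0.0, sigma, t_obs.size)

def ode_rhs(theta, x):
    return -theta * x

def solve_ode(theta, t, x0=1.0, substeps=20):
    """Forward-Euler integrator, standing in for an expensive ODE solver."""
    x, traj = x0, [x0]
    for lo, hi in zip(t[:-1], t[1:]):
        h = (hi - lo) / substeps
        for _ in range(substeps):
            x = x + h * ode_rhs(theta, x)
        traj.append(x)
    return np.array(traj)

# --- Exact likelihood: must solve the ODE at every evaluation ------------
def log_lik_exact(theta):
    if theta <= 0:
        return -np.inf
    resid = y_obs - solve_ode(theta, t_obs)
    return -0.5 * np.sum(resid**2) / sigma**2

# --- Surrogate in the spirit of gradient matching ------------------------
# Smooth the data once (cubic polynomial), differentiate the smoother, and
# score theta by how well the ODE gradients match the smoothed gradients.
coef = np.polyfit(t_obs, y_obs, 3)
x_hat = np.polyval(coef, t_obs)
dx_hat = np.polyval(np.polyder(coef), t_obs)
tau = 0.1  # gradient-mismatch scale (a tuning constant assumed here)

def log_lik_surrogate(theta):
    if theta <= 0:
        return -np.inf
    return -0.5 * np.sum((dx_hat - ode_rhs(theta, x_hat))**2) / tau**2

# --- Random-walk Metropolis, reused in both phases -----------------------
def metropolis(log_target, theta0, n_iter, step=0.1):
    theta, lp = theta0, log_target(theta0)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Phase 1: cheap surrogate moves the chain quickly toward the posterior mode.
phase1 = metropolis(log_lik_surrogate, theta0=5.0, n_iter=500)
# Phase 2: exact likelihood, started from the surrogate chain's endpoint.
phase2 = metropolis(log_lik_exact, theta0=phase1[-1], n_iter=2000)

theta_est = phase2[1000:].mean()
print(f"posterior mean of theta = {theta_est:.2f} (true value {theta_true})")
```

The design point is that `log_lik_surrogate` costs a single vectorised residual evaluation, while `log_lik_exact` requires a full numerical integration; spending the burn-in on the surrogate means the expensive solver is only invoked once the chain is already near the high-probability region.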

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-lazarus18a,
  title     = {Multiphase MCMC Sampling for Parameter Inference in Nonlinear Ordinary Differential Equations},
  author    = {Lazarus, Alan and Husmeier, Dirk and Papamarkou, Theodore},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1252--1260},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/lazarus18a/lazarus18a.pdf},
  url       = {https://proceedings.mlr.press/v84/lazarus18a.html},
  abstract  = {Traditionally, ODE parameter inference relies on solving the system of ODEs and assessing fit of the estimated signal with the observations. However, nonlinear ODEs often do not permit closed form solutions. Using numerical methods to solve the equations results in prohibitive computational costs, particularly when one adopts a Bayesian approach in sampling parameters from a posterior distribution. With the introduction of gradient matching, we can abandon the need to numerically solve the system of equations. Inherent in these efficient procedures is an introduction of bias to the learning problem as we no longer sample based on the exact likelihood function. This paper presents a multiphase MCMC approach that attempts to close the gap between efficiency and accuracy. By sampling using a surrogate likelihood, we accelerate convergence to the stationary distribution before sampling using the exact likelihood. We demonstrate that this method combines the efficiency of gradient matching and the accuracy of the exact likelihood scheme.}
}
Endnote
%0 Conference Paper
%T Multiphase MCMC Sampling for Parameter Inference in Nonlinear Ordinary Differential Equations
%A Alan Lazarus
%A Dirk Husmeier
%A Theodore Papamarkou
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-lazarus18a
%I PMLR
%P 1252--1260
%U https://proceedings.mlr.press/v84/lazarus18a.html
%V 84
%X Traditionally, ODE parameter inference relies on solving the system of ODEs and assessing fit of the estimated signal with the observations. However, nonlinear ODEs often do not permit closed form solutions. Using numerical methods to solve the equations results in prohibitive computational costs, particularly when one adopts a Bayesian approach in sampling parameters from a posterior distribution. With the introduction of gradient matching, we can abandon the need to numerically solve the system of equations. Inherent in these efficient procedures is an introduction of bias to the learning problem as we no longer sample based on the exact likelihood function. This paper presents a multiphase MCMC approach that attempts to close the gap between efficiency and accuracy. By sampling using a surrogate likelihood, we accelerate convergence to the stationary distribution before sampling using the exact likelihood. We demonstrate that this method combines the efficiency of gradient matching and the accuracy of the exact likelihood scheme.
APA
Lazarus, A., Husmeier, D. & Papamarkou, T. (2018). Multiphase MCMC Sampling for Parameter Inference in Nonlinear Ordinary Differential Equations. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1252-1260. Available from https://proceedings.mlr.press/v84/lazarus18a.html.