Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

George Papamakarios, David Sterratt, Iain Murray
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:837-848, 2019.

Abstract

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
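The abstract describes SNL's loop: simulate data at proposed parameters, fit a conditional density model of the likelihood, run MCMC on the resulting surrogate posterior, and use those posterior samples to guide the next round of simulations. Below is a minimal, hypothetical sketch of that loop on a toy problem; for tractability it replaces the paper's autoregressive flow with a simple conditional-Gaussian density estimator, and all names (`simulator`, `fit_surrogate`, etc.) are illustrative, not from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (stand-in for an intractable model): x | theta ~ N(theta, 1)
def simulator(thetas):
    return thetas + rng.normal(size=thetas.shape)

def prior_sample(n):
    return rng.uniform(-5.0, 5.0, size=n)

def prior_logpdf(theta):
    return -np.log(10.0) if -5.0 <= theta <= 5.0 else -np.inf

# Stand-in for the autoregressive flow q_phi(x | theta): fit a conditional
# Gaussian x | theta ~ N(a*theta + b, s^2) to the simulated pairs by least squares.
def fit_surrogate(thetas, xs):
    A = np.stack([thetas, np.ones_like(thetas)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, xs, rcond=None)
    s2 = np.var(xs - (a * thetas + b))
    def loglik(x0, theta):
        return -0.5 * np.log(2 * np.pi * s2) - 0.5 * (x0 - (a * theta + b)) ** 2 / s2
    return loglik

x_obs = 1.5
thetas, xs = np.empty(0), np.empty(0)
proposal = prior_sample  # first round simulates from the prior

for snl_round in range(3):
    new_thetas = proposal(200)                 # simulate at proposed parameters
    thetas = np.concatenate([thetas, new_thetas])
    xs = np.concatenate([xs, simulator(new_thetas)])
    loglik = fit_surrogate(thetas, xs)         # retrain on all pairs so far

    # Metropolis-Hastings on the surrogate posterior:
    # log p(theta | x_obs) ∝ log q(x_obs | theta) + log p(theta)
    def logpost(theta):
        return loglik(x_obs, theta) + prior_logpdf(theta)

    chain, th = [], 0.0
    lp = logpost(th)
    for _ in range(3000):
        th_new = th + 0.5 * rng.normal()
        lp_new = logpost(th_new)
        if np.log(rng.uniform()) < lp_new - lp:
            th, lp = th_new, lp_new
        chain.append(th)
    posterior = np.array(chain[1000:])
    proposal = lambda n: rng.choice(posterior, size=n)  # guide next round's sims
```

Because simulations concentrate where the surrogate posterior has mass, the density model only needs to be accurate in the high-posterior region, which is the source of the simulation savings the abstract claims. On this toy problem the true posterior is approximately N(1.5, 1), so `posterior.mean()` should land near 1.5.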

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-papamakarios19a,
  title     = {Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows},
  author    = {Papamakarios, George and Sterratt, David and Murray, Iain},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {837--848},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf},
  url       = {http://proceedings.mlr.press/v89/papamakarios19a.html},
  abstract  = {We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.}
}
Endnote
%0 Conference Paper
%T Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows
%A George Papamakarios
%A David Sterratt
%A Iain Murray
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-papamakarios19a
%I PMLR
%P 837--848
%U http://proceedings.mlr.press/v89/papamakarios19a.html
%V 89
%X We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
APA
Papamakarios, G., Sterratt, D. &amp; Murray, I. (2019). Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:837-848. Available from http://proceedings.mlr.press/v89/papamakarios19a.html.