Surrogate Likelihoods for Variational Annealed Importance Sampling

Martin Jankowiak, Du Phan
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:9881-9901, 2022.

Abstract

Variational inference is a powerful paradigm for approximate Bayesian inference with a number of appealing properties, including support for model learning and data subsampling. By contrast, MCMC methods like Hamiltonian Monte Carlo do not share these properties, but they remain attractive because, unlike parametric methods, MCMC is asymptotically unbiased. For these reasons, researchers have sought to combine the strengths of both classes of algorithms, with recent approaches coming closer to realizing this vision in practice. However, supporting data subsampling in these hybrid methods can be a challenge, a shortcoming that we address by introducing a surrogate likelihood that can be learned jointly with other variational parameters. We argue theoretically that the resulting algorithm allows an intuitive trade-off between inference fidelity and computational cost. In an extensive empirical comparison we show that our method performs well in practice and that it is well-suited for black-box inference in probabilistic programming frameworks.
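To make the core idea of the abstract concrete, the following is a minimal sketch, not the authors' implementation: it learns a cheap surrogate log-likelihood jointly with mean-field variational parameters from mini-batches, for a toy Bayesian logistic regression. The quadratic parameterization, the name `surrogate_loglik`, and the penalty weight are all illustrative assumptions; in the paper's actual method the learned surrogate stands in for the exact likelihood inside annealed importance sampling transitions.

```python
import torch

torch.manual_seed(0)

# Toy data: Bayesian logistic regression with N observations.
N, D, B = 10_000, 5, 100                       # dataset size, latent dim, mini-batch size
X = torch.randn(N, D)
w_true = torch.randn(D)
y = torch.bernoulli(torch.sigmoid(X @ w_true))

# Mean-field Gaussian variational parameters for the latent weights z.
mu = torch.zeros(D, requires_grad=True)
log_sigma = torch.zeros(D, requires_grad=True)

# Hypothetical quadratic surrogate s(z) = a.z + (b*z).z + c: once trained,
# evaluating it costs O(D) regardless of N, which is what would let
# annealing/HMC steps avoid touching the full dataset.
a = torch.zeros(D, requires_grad=True)
b = torch.zeros(D, requires_grad=True)
c = torch.zeros((), requires_grad=True)

opt = torch.optim.Adam([mu, log_sigma, a, b, c], lr=1e-2)

def surrogate_loglik(z):
    return z @ a + (b * z * z).sum() + c

for step in range(2000):
    opt.zero_grad()
    idx = torch.randint(0, N, (B,))
    z = mu + log_sigma.exp() * torch.randn(D)  # reparameterized posterior sample

    # Unbiased mini-batch estimate of the full-data log-likelihood.
    logits = X[idx] @ z
    batch_ll = (N / B) * torch.distributions.Bernoulli(logits=logits).log_prob(y[idx]).sum()

    log_prior = torch.distributions.Normal(0.0, 1.0).log_prob(z).sum()
    entropy = log_sigma.sum()                  # Gaussian entropy up to an additive constant

    elbo = batch_ll + log_prior + entropy      # standard stochastic ELBO

    # Jointly fit the surrogate to mini-batch log-likelihood estimates;
    # the small weight keeps the penalty from dominating the ELBO term.
    fit = (surrogate_loglik(z.detach()) - batch_ll.detach()) ** 2
    loss = -elbo + 1e-6 * fit
    loss.backward()
    opt.step()

print("posterior mean vs. truth:", mu.detach(), w_true)
```

In this sketch both the variational parameters and the surrogate are updated from the same subsampled mini-batches, which is the joint-learning idea the abstract describes; the paper itself goes further by using such a surrogate inside annealed importance sampling so that the expensive annealing steps never require a full pass over the data.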

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-jankowiak22a,
  title     = {Surrogate Likelihoods for Variational Annealed Importance Sampling},
  author    = {Jankowiak, Martin and Phan, Du},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {9881--9901},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/jankowiak22a/jankowiak22a.pdf},
  url       = {https://proceedings.mlr.press/v162/jankowiak22a.html}
}
Endnote
%0 Conference Paper
%T Surrogate Likelihoods for Variational Annealed Importance Sampling
%A Martin Jankowiak
%A Du Phan
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-jankowiak22a
%I PMLR
%P 9881--9901
%U https://proceedings.mlr.press/v162/jankowiak22a.html
%V 162
APA
Jankowiak, M. & Phan, D. (2022). Surrogate Likelihoods for Variational Annealed Importance Sampling. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:9881-9901. Available from https://proceedings.mlr.press/v162/jankowiak22a.html.