Variational Boosting: Iteratively Refining Posterior Approximations

Andrew C. Miller, Nicholas J. Foti, Ryan P. Adams
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2420-2429, 2017.

Abstract

We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing a trade-off between computation time and accuracy. We expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing variational algorithms.
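
To make the idea concrete, below is a minimal sketch (not the authors' implementation) of a single boosting step on a toy one-dimensional target. The current approximation q_old is held fixed while a new Gaussian component c and a mixture weight rho are fit by stochastic gradient descent on a Monte Carlo estimate of the negative ELBO of the mixture (1 - rho) q_old + rho c, using reparameterization gradients via the autograd library. The target density, initialization, and optimizer settings are all illustrative assumptions.

    # A sketch of one variational-boosting step on a toy 1-D target
    # (illustrative only; not the authors' code).
    import autograd.numpy as np
    from autograd import grad

    def log_target(x):
        # Unnormalized log density to approximate: a two-mode Gaussian mixture.
        return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

    def log_normal(x, mean, log_std):
        return (-0.5 * ((x - mean) / np.exp(log_std)) ** 2
                - log_std - 0.5 * np.log(2 * np.pi))

    # Existing approximation q_old: a single (fixed) standard normal.
    old_mean, old_log_std = 0.0, 0.0

    def log_q_new(x, params):
        # Log density of the boosted mixture (1 - rho) * q_old + rho * c.
        mean, log_std, rho_logit = params[0], params[1], params[2]
        rho = 1.0 / (1.0 + np.exp(-rho_logit))   # keep rho in (0, 1)
        return np.logaddexp(
            np.log(1.0 - rho) + log_normal(x, old_mean, old_log_std),
            np.log(rho) + log_normal(x, mean, log_std))

    def neg_elbo(params, n_samples=200):
        # Monte Carlo estimate of -ELBO(q_new), decomposed over the two
        # components: E_{q_new}[f] = (1 - rho) E_{q_old}[f] + rho E_c[f],
        # where f(x) = log p(x) - log q_new(x). Samples from the new
        # component c are reparameterized so gradients flow to its params.
        mean, log_std, rho_logit = params[0], params[1], params[2]
        rho = 1.0 / (1.0 + np.exp(-rho_logit))
        x_old = old_mean + np.exp(old_log_std) * np.random.randn(n_samples)
        x_new = mean + np.exp(log_std) * np.random.randn(n_samples)
        f_old = np.mean(log_target(x_old) - log_q_new(x_old, params))
        f_new = np.mean(log_target(x_new) - log_q_new(x_new, params))
        return -((1.0 - rho) * f_old + rho * f_new)

    # Fit the new component and its weight by stochastic gradient descent.
    params = np.array([1.0, 0.0, 0.0])   # [mean, log_std, logit(rho)] of c
    neg_elbo_grad = grad(neg_elbo)
    for step in range(1000):
        params = params - 0.05 * neg_elbo_grad(params)

Repeating this step, each time freezing the previously fitted mixture and adding one new component, is what lets the approximating class grow richer with each iteration, trading additional computation for accuracy as the abstract describes.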

Cite this Paper

BibTeX
@InProceedings{pmlr-v70-miller17a,
  title     = {Variational Boosting: Iteratively Refining Posterior Approximations},
  author    = {Andrew C. Miller and Nicholas J. Foti and Ryan P. Adams},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2420--2429},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/miller17a/miller17a.pdf},
  url       = {https://proceedings.mlr.press/v70/miller17a.html},
  abstract  = {We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing a trade-off between computation time and accuracy. We expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing variational algorithms.}
}
Endnote
%0 Conference Paper
%T Variational Boosting: Iteratively Refining Posterior Approximations
%A Andrew C. Miller
%A Nicholas J. Foti
%A Ryan P. Adams
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-miller17a
%I PMLR
%P 2420--2429
%U https://proceedings.mlr.press/v70/miller17a.html
%V 70
%X We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing a trade-off between computation time and accuracy. We expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing variational algorithms.
APA
Miller, A.C., Foti, N.J. & Adams, R.P. (2017). Variational Boosting: Iteratively Refining Posterior Approximations. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:2420-2429. Available from https://proceedings.mlr.press/v70/miller17a.html.