Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference

Mike Wu, Noah Goodman, Stefano Ermon
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2877-2886, 2019.

Abstract

Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples that match sample moments with the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model. An implementation is available at https://github.com/mhw32/antithetic-vae-public.
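For intuition, here is a minimal sketch of the classic antithetic trick for a reparameterized Gaussian: pairing each sample z = mu + sigma * eps with its negated-noise counterpart 2 * mu - z forces the pooled sample mean to equal mu exactly, while every sample remains differentiable in (mu, sigma). This illustrates only the first-moment case; the paper's sampler matches higher moments of the underlying distribution as well, and the function name below is illustrative rather than taken from the linked repository.

    # Minimal sketch: negated-noise antithetic sampling for a
    # reparameterized Gaussian. Not the paper's full moment-matching
    # sampler; see https://github.com/mhw32/antithetic-vae-public.
    import torch

    def antithetic_gaussian_samples(mu, log_sigma, k):
        """Draw k i.i.d. samples z = mu + sigma * eps and pair each with
        its antithetic counterpart 2*mu - z (i.e., eps -> -eps), so the
        pooled sample mean equals mu exactly while gradients still flow
        to mu and sigma."""
        sigma = log_sigma.exp()
        eps = torch.randn(k, *mu.shape)       # standard normal noise
        z = mu + sigma * eps                  # reparameterized samples
        z_anti = 2.0 * mu - z                 # antithetic (negated-noise) pairs
        return torch.cat([z, z_anti], dim=0)  # 2k correlated samples

    # Usage: a Monte Carlo estimate of E_q[||z||^2] from 2k correlated
    # samples, differentiable end-to-end for variational inference.
    mu = torch.zeros(2, requires_grad=True)
    log_sigma = torch.zeros(2, requires_grad=True)
    z = antithetic_gaussian_samples(mu, log_sigma, k=8)
    estimate = (z ** 2).sum(dim=-1).mean()
    estimate.backward()

Because the two halves are negatively correlated, the estimator's variance is no larger than that of 2k independent samples whenever the integrand is monotone in the noise; the paper extends this idea to samples whose moments match the true moments of an importance distribution.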

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-wu19c,
  title     = {Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference},
  author    = {Wu, Mike and Goodman, Noah and Ermon, Stefano},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2877--2886},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/wu19c/wu19c.pdf},
  url       = {https://proceedings.mlr.press/v89/wu19c.html},
  abstract  = {Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples that match sample moments with the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model. An implementation is available at https://github.com/mhw32/antithetic-vae-public.}
}
Endnote
%0 Conference Paper
%T Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference
%A Mike Wu
%A Noah Goodman
%A Stefano Ermon
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-wu19c
%I PMLR
%P 2877--2886
%U https://proceedings.mlr.press/v89/wu19c.html
%V 89
%X Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples that match sample moments with the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model. An implementation is available at https://github.com/mhw32/antithetic-vae-public.
APA
Wu, M., Goodman, N. & Ermon, S. (2019). Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2877-2886. Available from https://proceedings.mlr.press/v89/wu19c.html.
