Amortized Population Gibbs Samplers with Neural Sufficient Statistics

Hao Wu, Heiko Zimmermann, Eli Sennesh, Tuan Anh Le, Jan-Willem van de Meent
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10421-10431, 2020.

Abstract

We develop amortized population Gibbs (APG) samplers, a class of scalable methods that frame structured variational inference as adaptive importance sampling. APG samplers construct high-dimensional proposals by iterating over updates to lower-dimensional blocks of variables. We train each conditional proposal by minimizing the inclusive KL divergence with respect to the conditional posterior. To appropriately account for the size of the input data, we develop a new parameterization in terms of neural sufficient statistics. Experiments show that APG samplers can be used to train highly-structured deep generative models in an unsupervised manner, and achieve substantial improvements in inference accuracy relative to standard autoencoding variational methods.
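For intuition, the following is a minimal numpy sketch of the blockwise propose-reweight-resample structure the abstract describes, on a toy model of our own choosing: two latent blocks z1, z2 with standard-normal priors and observations x_n ~ N(z1 + z2, 1). The toy model, the hand-set Gaussian conditional proposals, and every name in the code are illustrative assumptions, not the paper's implementation; in particular, APG trains its conditional proposals as neural networks by minimizing the inclusive KL divergence, which this sketch replaces with fixed proposals so that only the sweep itself is shown.

# Hypothetical sketch of one APG-style sweep (toy model, hand-set proposals).
import numpy as np

rng = np.random.default_rng(0)

# Toy generative model (an illustrative assumption, not from the paper):
#   z1 ~ N(0, 1), z2 ~ N(0, 1), x_n ~ N(z1 + z2, 1) for n = 1..N
N = 50
x = rng.normal(1.0 + (-0.5), 1.0, size=N)

def log_joint(z1, z2):
    # log p(x, z1, z2) up to additive constants; z1, z2 have shape (K,)
    log_prior = -0.5 * z1**2 - 0.5 * z2**2
    log_lik = -0.5 * ((x[None, :] - (z1 + z2)[:, None]) ** 2).sum(axis=1)
    return log_prior + log_lik

def log_normal(z, mu, sigma):
    # log N(z; mu, sigma^2) up to an additive constant
    return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma)

def resample(logw, *arrays):
    # multinomial resampling; resets the population to uniform weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(len(w), size=len(w), p=w)
    return [a[idx] for a in arrays]

# A sufficient-statistic-style summary: a sum (here, mean) of per-point
# features. In the paper this pooling is done over learned neural features.
stat = x.mean()

K = 100  # population (particle) size

# Initial proposal q(z1, z2 | x), followed by standard importance weights.
mu0, s0 = stat / 2, 1.0
z1 = rng.normal(mu0, s0, size=K)
z2 = rng.normal(mu0, s0, size=K)
logw = log_joint(z1, z2) - log_normal(z1, mu0, s0) - log_normal(z2, mu0, s0)

for sweep in range(5):
    # Block 1: propose z1' ~ q(z1 | z2, x); the incremental weight uses the
    # same conditional proposal as the backward kernel, evaluated at old z1.
    z1, z2 = resample(logw, z1, z2)
    mu1, s1 = stat - z2, 0.5  # hand-set proposal; a trained network in APG
    z1_new = rng.normal(mu1, s1)
    logw = (log_joint(z1_new, z2) - log_joint(z1, z2)
            + log_normal(z1, mu1, s1) - log_normal(z1_new, mu1, s1))
    z1 = z1_new
    # Block 2: the symmetric update for z2 given z1.
    z1, z2 = resample(logw, z1, z2)
    mu2, s2 = stat - z1, 0.5
    z2_new = rng.normal(mu2, s2)
    logw = (log_joint(z1, z2_new) - log_joint(z1, z2)
            + log_normal(z2, mu2, s2) - log_normal(z2_new, mu2, s2))
    z2 = z2_new

w = np.exp(logw - logw.max()); w /= w.sum()
print("estimated posterior mean of z1 + z2:", (w * (z1 + z2)).sum())
print("data mean (approximate target):   ", stat)

Note that each block update corrects for the mismatch between the proposal and the exact conditional posterior through the incremental importance weight; this is what allows the method to use learned, imperfect conditional proposals while still targeting the correct posterior.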

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-wu20h,
  title     = {Amortized Population {G}ibbs Samplers with Neural Sufficient Statistics},
  author    = {Wu, Hao and Zimmermann, Heiko and Sennesh, Eli and Le, Tuan Anh and Van De Meent, Jan-Willem},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10421--10431},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/wu20h/wu20h.pdf},
  url       = {https://proceedings.mlr.press/v119/wu20h.html},
  abstract  = {We develop amortized population Gibbs (APG) samplers, a class of scalable methods that frame structured variational inference as adaptive importance sampling. APG samplers construct high-dimensional proposals by iterating over updates to lower-dimensional blocks of variables. We train each conditional proposal by minimizing the inclusive KL divergence with respect to the conditional posterior. To appropriately account for the size of the input data, we develop a new parameterization in terms of neural sufficient statistics. Experiments show that APG samplers can be used to train highly-structured deep generative models in an unsupervised manner, and achieve substantial improvements in inference accuracy relative to standard autoencoding variational methods.}
}
Endnote
%0 Conference Paper
%T Amortized Population Gibbs Samplers with Neural Sufficient Statistics
%A Hao Wu
%A Heiko Zimmermann
%A Eli Sennesh
%A Tuan Anh Le
%A Jan-Willem Van De Meent
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-wu20h
%I PMLR
%P 10421--10431
%U https://proceedings.mlr.press/v119/wu20h.html
%V 119
%X We develop amortized population Gibbs (APG) samplers, a class of scalable methods that frame structured variational inference as adaptive importance sampling. APG samplers construct high-dimensional proposals by iterating over updates to lower-dimensional blocks of variables. We train each conditional proposal by minimizing the inclusive KL divergence with respect to the conditional posterior. To appropriately account for the size of the input data, we develop a new parameterization in terms of neural sufficient statistics. Experiments show that APG samplers can be used to train highly-structured deep generative models in an unsupervised manner, and achieve substantial improvements in inference accuracy relative to standard autoencoding variational methods.
APA
Wu, H., Zimmermann, H., Sennesh, E., Le, T.A. & van de Meent, J.-W. (2020). Amortized Population Gibbs Samplers with Neural Sufficient Statistics. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10421-10431. Available from https://proceedings.mlr.press/v119/wu20h.html.