Hybrid Variational/Gibbs Collapsed Inference in Topic Models

Max Welling, Yee Whye Teh, Bert Kappen
Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, PMLR R6:587-594, 2008.

Abstract

Variational Bayesian inference and (collapsed) Gibbs sampling are two of the most important classes of inference algorithms for Bayesian networks. Both have advantages and disadvantages: collapsed Gibbs sampling is unbiased but inefficient for large count values and requires averaging over many samples to reduce variance. Variational Bayesian inference, on the other hand, is efficient and accurate for large count values but suffers from bias for small counts. We propose a hybrid algorithm that combines the best of both worlds: it samples very small counts and applies variational updates to large counts. This hybridization is shown to significantly improve test-set perplexity relative to variational inference at no computational cost.
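To make the hybrid concrete, below is a minimal sketch of the idea described in the abstract, assuming a CVB0-style collapsed representation of LDA; it is not the authors' exact algorithm. Per distinct (document, word) pair, counts at or below a threshold are resampled by collapsed Gibbs, while larger counts receive a deterministic mean-field update on fractional topic counts. The threshold value, the toy corpus, and all names (`THRESHOLD`, `sweep`, etc.) are illustrative assumptions.

```python
# Hybrid collapsed Gibbs / mean-field sketch for LDA (illustrative, not the
# paper's exact algorithm): rare (document, word) pairs are Gibbs-sampled,
# frequent pairs get deterministic fractional-count updates.
import numpy as np

rng = np.random.default_rng(0)

K, W, D = 3, 10, 4            # topics, vocabulary size, documents
alpha, beta = 0.1, 0.01       # symmetric Dirichlet hyperparameters
THRESHOLD = 3                 # counts <= THRESHOLD are Gibbs-sampled (assumed)

# Toy corpus: counts[d, w] = number of times word w occurs in document d.
counts = rng.integers(0, 8, size=(D, W))

# s[d, w, k]: (possibly fractional) number of tokens of word w in document d
# currently assigned to topic k; initialised uniformly over topics.
s = counts[:, :, None] * np.full(K, 1.0 / K)

def sweep(s):
    """One pass over all (document, word) pairs with nonzero count."""
    N_dk = s.sum(axis=1)      # expected doc-topic counts, shape (D, K)
    N_kw = s.sum(axis=0).T    # expected topic-word counts, shape (K, W)
    N_k = N_kw.sum(axis=1)    # expected topic totals, shape (K,)
    for d in range(D):
        for w in range(W):
            n = counts[d, w]
            if n == 0:
                continue
            # Remove this pair's current assignment from the statistics.
            cur = s[d, w]
            N_dk[d] -= cur
            N_kw[:, w] -= cur
            N_k -= cur
            # Collapsed conditional over topics for a token of this pair.
            p = (N_dk[d] + alpha) * (N_kw[:, w] + beta) / (N_k + W * beta)
            p /= p.sum()
            if n <= THRESHOLD:
                # Small count: draw hard assignments (collapsed Gibbs).
                new = rng.multinomial(n, p).astype(float)
            else:
                # Large count: deterministic mean-field update with
                # fractional counts (the variational half of the hybrid).
                new = n * p
            s[d, w] = new
            N_dk[d] += new
            N_kw[:, w] += new
            N_k += new
    return s

for _ in range(50):
    s = sweep(s)

# Posterior mean estimate of the topic-word distributions.
phi = s.sum(axis=0).T + beta
phi /= phi.sum(axis=1, keepdims=True)
print(np.round(phi, 3))
```

Because the branch happens once per distinct (document, word) pair, a sweep touches each cell of the count matrix exactly once, so the hybrid's per-iteration cost is essentially that of a pure variational pass, consistent with the abstract's "no computational cost" claim.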

Cite this Paper

BibTeX

@InProceedings{pmlr-vR6-welling08a,
  title     = {Hybrid Variational/Gibbs Collapsed Inference in Topic Models},
  author    = {Welling, Max and Teh, Yee Whye and Kappen, Bert},
  booktitle = {Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence},
  pages     = {587--594},
  year      = {2008},
  editor    = {McAllester, David A. and Myllymäki, Petri},
  volume    = {R6},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/r6/main/assets/welling08a/welling08a.pdf},
  url       = {https://proceedings.mlr.press/r6/welling08a.html},
  abstract  = {Variational Bayesian inference and (collapsed) Gibbs sampling are two of the most important classes of inference algorithms for Bayesian networks. Both have advantages and disadvantages: collapsed Gibbs sampling is unbiased but inefficient for large count values and requires averaging over many samples to reduce variance. Variational Bayesian inference, on the other hand, is efficient and accurate for large count values but suffers from bias for small counts. We propose a hybrid algorithm that combines the best of both worlds: it samples very small counts and applies variational updates to large counts. This hybridization is shown to significantly improve test-set perplexity relative to variational inference at no computational cost.},
  note      = {Reissued by PMLR on 09 October 2024.}
}
APA

Welling, M., Teh, Y. W., & Kappen, B. (2008). Hybrid Variational/Gibbs Collapsed Inference in Topic Models. Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research R6:587-594. Available from https://proceedings.mlr.press/r6/welling08a.html. Reissued by PMLR on 09 October 2024.
