An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process

Amar Shah, David Knowles, Zoubin Ghahramani
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1594-1603, 2015.

Abstract

Stochastic variational inference (SVI) is emerging as the most promising candidate for scaling inference in Bayesian probabilistic models to large datasets. However, the performance of these methods has been assessed primarily in the context of Bayesian topic models, particularly latent Dirichlet allocation (LDA). Deriving several new algorithms, and using synthetic, image and genomic datasets, we investigate whether the understanding gleaned from LDA applies in the setting of sparse latent factor models, specifically beta process factor analysis (BPFA). We demonstrate that the big picture is consistent: using Gibbs sampling within SVI to maintain certain posterior dependencies is extremely effective. However, we also show that different posterior dependencies are important in BPFA relative to LDA.
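To make the abstract's setting concrete, here is a minimal, illustrative sketch of an SVI-style update for a conjugate Beta-Bernoulli coin model (a toy model, not the paper's BPFA model or its Gibbs-within-SVI algorithms): at each step, a minibatch yields an intermediate natural-parameter estimate scaled to the full dataset, and the global variational parameters take a Robbins-Monro step toward it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N Bernoulli draws with true success probability 0.3.
N, theta_true = 10_000, 0.3
x = (rng.random(N) < theta_true).astype(float)

# Prior Beta(a0, b0); variational posterior Beta(a, b) over the coin bias.
a0, b0 = 1.0, 1.0
a, b = a0, b0

batch_size = 100
for t in range(1, 201):
    rho = (t + 10) ** -0.7                      # decaying step size
    batch = rng.choice(N, batch_size, replace=False)
    s = x[batch].sum()
    # Intermediate estimate: pretend the minibatch were the whole dataset.
    a_hat = a0 + (N / batch_size) * s
    b_hat = b0 + (N / batch_size) * (batch_size - s)
    # Stochastic natural-gradient step on the natural parameters.
    a = (1 - rho) * a + rho * a_hat
    b = (1 - rho) * b + rho * b_hat

posterior_mean = a / (a + b)                    # should track ~0.3
```

Because the model is conjugate, the noisy natural gradient reduces to this convex combination of the current and intermediate parameters; the paper's contribution concerns what to do when such clean global updates do not capture the important posterior dependencies.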

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-shahb15,
  title =     {An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process},
  author =    {Shah, Amar and Knowles, David and Ghahramani, Zoubin},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages =     {1594--1603},
  year =      {2015},
  editor =    {Bach, Francis and Blei, David},
  volume =    {37},
  series =    {Proceedings of Machine Learning Research},
  address =   {Lille, France},
  month =     {07--09 Jul},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v37/shahb15.pdf},
  url =       {https://proceedings.mlr.press/v37/shahb15.html},
  abstract =  {Stochastic variational inference (SVI) is emerging as the most promising candidate for scaling inference in Bayesian probabilistic models to large datasets. However, the performance of these methods has been assessed primarily in the context of Bayesian topic models, particularly latent Dirichlet allocation (LDA). Deriving several new algorithms, and using synthetic, image and genomic datasets, we investigate whether the understanding gleaned from LDA applies in the setting of sparse latent factor models, specifically beta process factor analysis (BPFA). We demonstrate that the big picture is consistent: using Gibbs sampling within SVI to maintain certain posterior dependencies is extremely effective. However, we also show that different posterior dependencies are important in BPFA relative to LDA.}
}
Endnote
%0 Conference Paper
%T An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process
%A Amar Shah
%A David Knowles
%A Zoubin Ghahramani
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-shahb15
%I PMLR
%P 1594--1603
%U https://proceedings.mlr.press/v37/shahb15.html
%V 37
%X Stochastic variational inference (SVI) is emerging as the most promising candidate for scaling inference in Bayesian probabilistic models to large datasets. However, the performance of these methods has been assessed primarily in the context of Bayesian topic models, particularly latent Dirichlet allocation (LDA). Deriving several new algorithms, and using synthetic, image and genomic datasets, we investigate whether the understanding gleaned from LDA applies in the setting of sparse latent factor models, specifically beta process factor analysis (BPFA). We demonstrate that the big picture is consistent: using Gibbs sampling within SVI to maintain certain posterior dependencies is extremely effective. However, we also show that different posterior dependencies are important in BPFA relative to LDA.
RIS
TY - CPAPER
TI - An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process
AU - Amar Shah
AU - David Knowles
AU - Zoubin Ghahramani
BT - Proceedings of the 32nd International Conference on Machine Learning
DA - 2015/06/01
ED - Francis Bach
ED - David Blei
ID - pmlr-v37-shahb15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 37
SP - 1594
EP - 1603
L1 - http://proceedings.mlr.press/v37/shahb15.pdf
UR - https://proceedings.mlr.press/v37/shahb15.html
AB - Stochastic variational inference (SVI) is emerging as the most promising candidate for scaling inference in Bayesian probabilistic models to large datasets. However, the performance of these methods has been assessed primarily in the context of Bayesian topic models, particularly latent Dirichlet allocation (LDA). Deriving several new algorithms, and using synthetic, image and genomic datasets, we investigate whether the understanding gleaned from LDA applies in the setting of sparse latent factor models, specifically beta process factor analysis (BPFA). We demonstrate that the big picture is consistent: using Gibbs sampling within SVI to maintain certain posterior dependencies is extremely effective. However, we also show that different posterior dependencies are important in BPFA relative to LDA.
ER -
APA
Shah, A., Knowles, D. & Ghahramani, Z. (2015). An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1594-1603. Available from https://proceedings.mlr.press/v37/shahb15.html.