An Empirical Study of Stochastic Variational Inference Algorithms for the Beta Bernoulli Process

Amar Shah, David Knowles, Zoubin Ghahramani;
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1594-1603, 2015.

Abstract

Stochastic variational inference (SVI) is emerging as the most promising candidate for scaling inference in Bayesian probabilistic models to large datasets. However, the performance of these methods has been assessed primarily in the context of Bayesian topic models, particularly latent Dirichlet allocation (LDA). Deriving several new algorithms, and using synthetic, image and genomic datasets, we investigate whether the understanding gleaned from LDA applies in the setting of sparse latent factor models, specifically beta process factor analysis (BPFA). We demonstrate that the big picture is consistent: using Gibbs sampling within SVI to maintain certain posterior dependencies is extremely effective. However, we also show that different posterior dependencies are important in BPFA relative to LDA.
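To make the model concrete, the sparse latent factor structure studied here can be sketched with the standard finite (K-atom) approximation to the beta-Bernoulli process underlying BPFA. This is a minimal illustrative sketch, not the paper's implementation; the sizes N, D, K and the hyperparameters a, b are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: N observations of dimension D,
# with K latent factors in the finite beta-Bernoulli approximation.
N, D, K = 100, 20, 10
a, b = 1.0, 1.0  # beta process mass parameters (illustrative values)

# Finite approximation to the beta process: per-factor inclusion probabilities.
pi = rng.beta(a / K, b * (K - 1) / K, size=K)

# Sparse binary matrix: which factors each observation uses.
Z = rng.random((N, K)) < pi

# Real-valued factor scores and loadings, as in factor analysis.
W = rng.normal(0.0, 1.0, size=(N, K))
Phi = rng.normal(0.0, 1.0, size=(K, D))

# Observations: sparsely selected factors plus Gaussian noise.
X = (Z * W) @ Phi + rng.normal(0.0, 0.1, size=(N, D))
```

The key property is the sparsity induced by Z: each column's inclusion probability pi_k is drawn from a Beta prior whose mass shrinks with K, so most factors are used by only a few observations. It is the posterior dependencies among pi, Z, and W that the paper's SVI variants (with and without Gibbs steps) treat differently.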