Scalable MCMC for Mixed Membership Stochastic Blockmodels

Wenzhe Li, Sungjin Ahn, Max Welling
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:723-731, 2016.

Abstract

We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition we develop an approximation that can handle models that entertain a very large number of communities. The experimental results show that SG-MCMC strictly dominates competing algorithms in all cases.
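The sampler named in the abstract acts on parameters that live on probability simplices (the per-node membership vectors and the community-interaction matrix). As a rough illustration of the kind of update involved, here is a minimal sketch of a generic stochastic gradient Riemannian Langevin step in the expanded-mean parameterization of Patterson and Teh (2013), on which this line of work builds. The function name, argument names, and the generic gradient argument are illustrative assumptions, not the paper's exact MMSB update.

    import numpy as np

    def sgrld_step(theta, grad_est, eps, a=1.0, rng=None):
        """One generic SGRLD step in the expanded-mean parameterization.

        theta    -- unnormalized positive parameters; the simplex point
                    is theta / theta.sum()
        grad_est -- preconditioned (natural-gradient) mini-batch estimate
                    of the log-likelihood gradient, rescaled to the full
                    data set (illustrative placeholder)
        eps      -- Langevin step size
        a        -- concentration of a symmetric Dirichlet-type prior
        """
        rng = np.random.default_rng() if rng is None else rng
        # Gaussian noise with variance eps, scaled below by sqrt(theta)
        # in accordance with the Riemannian preconditioning.
        noise = rng.normal(scale=np.sqrt(eps), size=theta.shape)
        # The absolute value mirrors proposals back onto the positive
        # orthant, so iterates stay valid without a rejection step.
        return np.abs(theta + 0.5 * eps * (a - theta + grad_est)
                      + np.sqrt(theta) * noise)

In the paper's setting, updates of roughly this form would be applied on mini-batches of node pairs to each membership vector and to the block matrix; the linked PDF gives the exact updates and the large-community approximation.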

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-li16d,
  title     = {Scalable MCMC for Mixed Membership Stochastic Blockmodels},
  author    = {Li, Wenzhe and Ahn, Sungjin and Welling, Max},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {723--731},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/li16d.pdf},
  url       = {https://proceedings.mlr.press/v51/li16d.html},
  abstract  = {We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition we develop an approximation that can handle models that entertain a very large number of communities. The experimental results show that SG-MCMC strictly dominates competing algorithms in all cases.}
}
Endnote
%0 Conference Paper
%T Scalable MCMC for Mixed Membership Stochastic Blockmodels
%A Wenzhe Li
%A Sungjin Ahn
%A Max Welling
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-li16d
%I PMLR
%P 723--731
%U https://proceedings.mlr.press/v51/li16d.html
%V 51
%X We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition we develop an approximation that can handle models that entertain a very large number of communities. The experimental results show that SG-MCMC strictly dominates competing algorithms in all cases.
RIS
TY  - CPAPER
TI  - Scalable MCMC for Mixed Membership Stochastic Blockmodels
AU  - Wenzhe Li
AU  - Sungjin Ahn
AU  - Max Welling
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-li16d
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 723
EP  - 731
L1  - http://proceedings.mlr.press/v51/li16d.pdf
UR  - https://proceedings.mlr.press/v51/li16d.html
AB  - We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition we develop an approximation that can handle models that entertain a very large number of communities. The experimental results show that SG-MCMC strictly dominates competing algorithms in all cases.
ER  -
APA
Li, W., Ahn, S. & Welling, M. (2016). Scalable MCMC for Mixed Membership Stochastic Blockmodels. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:723-731. Available from https://proceedings.mlr.press/v51/li16d.html.

Related Material

Download PDF: http://proceedings.mlr.press/v51/li16d.pdf