Markov Mixed Membership Models

Aonan Zhang, John Paisley
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:475-483, 2015.

Abstract

We present a Markov mixed membership model (Markov M3) for grouped data that learns a fully connected graph structure among mixing components. A key feature of Markov M3 is that it interprets the mixed membership assignment as a Markov random walk over this graph of nodes. This is in contrast to tree-structured models in which the assignment is done according to a tree structure on the mixing components. The Markov structure results in a simple parametric model that can learn a complex dependency structure between nodes, while still maintaining full conjugacy for closed-form stochastic variational inference. Empirical results demonstrate that Markov M3 performs well compared with tree-structured topic models, and can learn meaningful dependency structure between topics.
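
To make the key mechanism concrete, the following is a minimal, hypothetical sketch of a generative process consistent with the abstract's description. It is not the authors' code; the variable names, dimensions, and Dirichlet hyperparameters are all illustrative assumptions. Within a document, each word's topic indicator is reached by one step of a Markov random walk over a fully connected graph of K topics, and Dirichlet priors on the initial distribution, the transition rows, and the topic-word distributions keep every conditional conjugate.

import numpy as np

rng = np.random.default_rng(0)

K, V = 10, 1000  # illustrative: K topics (graph nodes), vocabulary of size V

# Dirichlet draws stand in for the model's conjugate priors (hyperparameters assumed).
pi = rng.dirichlet(np.ones(K))            # initial distribution over topics
A = rng.dirichlet(np.ones(K), size=K)     # row-stochastic K x K matrix: edge weights of the topic graph
beta = rng.dirichlet(np.ones(V), size=K)  # per-topic word distributions

def generate_document(n_words):
    # Topic indicators form a Markov chain: each word's topic is reached
    # by one step of a random walk from the previous word's topic.
    z = rng.choice(K, p=pi)
    words = []
    for _ in range(n_words):
        words.append(rng.choice(V, p=beta[z]))  # emit a word from the current topic
        z = rng.choice(K, p=A[z])               # walk to a neighboring topic
    return words

doc = generate_document(50)

Under this reading, every latent variable has a multinomial likelihood with a Dirichlet prior, so the variational updates remain closed-form, which is the conjugacy property the abstract highlights.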

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-zhangd15,
  title     = {Markov Mixed Membership Models},
  author    = {Zhang, Aonan and Paisley, John},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {475--483},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/zhangd15.pdf},
  url       = {https://proceedings.mlr.press/v37/zhangd15.html},
  abstract  = {We present a Markov mixed membership model (Markov M3) for grouped data that learns a fully connected graph structure among mixing components. A key feature of Markov M3 is that it interprets the mixed membership assignment as a Markov random walk over this graph of nodes. This is in contrast to tree-structured models in which the assignment is done according to a tree structure on the mixing components. The Markov structure results in a simple parametric model that can learn a complex dependency structure between nodes, while still maintaining full conjugacy for closed-form stochastic variational inference. Empirical results demonstrate that Markov M3 performs well compared with tree structured topic models, and can learn meaningful dependency structure between topics.}
}
Endnote
%0 Conference Paper
%T Markov Mixed Membership Models
%A Aonan Zhang
%A John Paisley
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-zhangd15
%I PMLR
%P 475--483
%U https://proceedings.mlr.press/v37/zhangd15.html
%V 37
%X We present a Markov mixed membership model (Markov M3) for grouped data that learns a fully connected graph structure among mixing components. A key feature of Markov M3 is that it interprets the mixed membership assignment as a Markov random walk over this graph of nodes. This is in contrast to tree-structured models in which the assignment is done according to a tree structure on the mixing components. The Markov structure results in a simple parametric model that can learn a complex dependency structure between nodes, while still maintaining full conjugacy for closed-form stochastic variational inference. Empirical results demonstrate that Markov M3 performs well compared with tree structured topic models, and can learn meaningful dependency structure between topics.
RIS
TY  - CPAPER
TI  - Markov Mixed Membership Models
AU  - Aonan Zhang
AU  - John Paisley
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-zhangd15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 475
EP  - 483
L1  - http://proceedings.mlr.press/v37/zhangd15.pdf
UR  - https://proceedings.mlr.press/v37/zhangd15.html
AB  - We present a Markov mixed membership model (Markov M3) for grouped data that learns a fully connected graph structure among mixing components. A key feature of Markov M3 is that it interprets the mixed membership assignment as a Markov random walk over this graph of nodes. This is in contrast to tree-structured models in which the assignment is done according to a tree structure on the mixing components. The Markov structure results in a simple parametric model that can learn a complex dependency structure between nodes, while still maintaining full conjugacy for closed-form stochastic variational inference. Empirical results demonstrate that Markov M3 performs well compared with tree structured topic models, and can learn meaningful dependency structure between topics.
ER  -
APA
Zhang, A. & Paisley, J. (2015). Markov Mixed Membership Models. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:475-483. Available from https://proceedings.mlr.press/v37/zhangd15.html.
