Markov Topic Models

Chong Wang, Bo Thiesson, Chris Meek, David Blei
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:583-590, 2009.

Abstract

We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
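As a rough illustration of the modeling idea summarized above (using illustrative notation rather than the paper's own), a Gaussian Markov random field can couple the corpus-specific versions of a topic so that related corpora are encouraged to share similar topic parameters. A minimal sketch of such a prior, assuming topic k has parameters \beta_k^{(v)} in each of V corpora connected by a graph G with edge precisions \lambda_{uv}, a tying precision \lambda_0, and a shared mean m_k (all assumed names for this sketch):

% Hedged sketch of a GMRF prior tying topic k across corpora; the symbols
% beta_k^{(v)}, G, lambda_{uv}, lambda_0, and m_k are illustrative assumptions,
% not the paper's notation.
\[
p\big(\beta_k^{(1)},\dots,\beta_k^{(V)}\big) \;\propto\;
\exp\!\Big(
  -\tfrac{1}{2}\sum_{(u,v)\in E(G)} \lambda_{uv}\,\big\|\beta_k^{(u)}-\beta_k^{(v)}\big\|^2
  \;-\;\tfrac{1}{2}\sum_{v=1}^{V} \lambda_{0}\,\big\|\beta_k^{(v)}-m_k\big\|^2
\Big)
\]

The pairwise quadratic penalties correspond to a joint Gaussian with a sparse precision matrix, i.e., a Gaussian Markov random field over the corpora.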

Cite this Paper

BibTeX
@InProceedings{pmlr-v5-wang09b,
  title     = {Markov Topic Models},
  author    = {Wang, Chong and Thiesson, Bo and Meek, Chris and Blei, David},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {583--590},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/wang09b/wang09b.pdf},
  url       = {https://proceedings.mlr.press/v5/wang09b.html},
  abstract  = {We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.}
}
Endnote
%0 Conference Paper
%T Markov Topic Models
%A Chong Wang
%A Bo Thiesson
%A Chris Meek
%A David Blei
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-wang09b
%I PMLR
%P 583--590
%U https://proceedings.mlr.press/v5/wang09b.html
%V 5
%X We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
RIS
TY - CPAPER
TI - Markov Topic Models
AU - Chong Wang
AU - Bo Thiesson
AU - Chris Meek
AU - David Blei
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-wang09b
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 5
SP - 583
EP - 590
L1 - http://proceedings.mlr.press/v5/wang09b/wang09b.pdf
UR - https://proceedings.mlr.press/v5/wang09b.html
AB - We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
ER -
APA
Wang, C., Thiesson, B., Meek, C. & Blei, D.. (2009). Markov Topic Models. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:583-590 Available from https://proceedings.mlr.press/v5/wang09b.html.