Markov Topic Models

Chong Wang, Bo Thiesson, Chris Meek, David Blei
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:583-590, 2009.

Abstract

We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
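To make the modeling idea in the abstract concrete, here is a small, hedged sketch: a toy Python simulation (not the authors' code, and not necessarily their exact parameterization) in which each corpus has its own topic-word parameters, coupled across corpora through a Gaussian Markov random field specified by a precision matrix, after which documents are generated LDA-style. The corpus/topic/vocabulary sizes, the chain-structured precision matrix, and the softmax normalization are illustrative assumptions.

# Hypothetical sketch of an MTM-style generative process (illustrative, not the paper's code).
# Assumption: for each topic k and word w, the per-corpus log weights
# (beta[1,k,w], ..., beta[V,k,w]) are drawn from a multivariate Gaussian whose
# precision matrix encodes correlations between corpora (the Gaussian Markov
# random field); documents are then generated LDA-style within each corpus.
import numpy as np

rng = np.random.default_rng(0)

V_corpora, K_topics, W_vocab = 3, 4, 50   # toy sizes (illustrative only)
alpha = 0.1                               # Dirichlet prior on topic proportions

# GMRF over corpora: a precision matrix coupling corpora along a chain 1-2-3.
precision = np.array([[2.0, -1.0, 0.0],
                      [-1.0, 2.0, -1.0],
                      [0.0, -1.0, 2.0]]) + 0.5 * np.eye(V_corpora)
cov = np.linalg.inv(precision)

# Draw correlated per-corpus log word weights for each topic and vocabulary word.
beta = np.empty((V_corpora, K_topics, W_vocab))
for k in range(K_topics):
    for w in range(W_vocab):
        beta[:, k, w] = rng.multivariate_normal(np.zeros(V_corpora), cov)

# Normalize into per-corpus topic-word distributions (softmax over the vocabulary).
topic_word = np.exp(beta)
topic_word /= topic_word.sum(axis=2, keepdims=True)

def generate_document(corpus, n_words=20):
    """Generate one toy document from the given corpus in LDA fashion."""
    theta = rng.dirichlet(alpha * np.ones(K_topics))    # document topic proportions
    z = rng.choice(K_topics, size=n_words, p=theta)     # per-word topic assignments
    return [rng.choice(W_vocab, p=topic_word[corpus, zi]) for zi in z]

print(generate_document(corpus=0))

In this sketch the precision matrix plays the role the abstract assigns to the Gaussian (Markov) random field: corpora that are linked share statistical strength, so their versions of the same topic stay correlated, while each corpus still keeps its own topic-word parameters.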

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-wang09b,
  title     = {Markov Topic Models},
  author    = {Chong Wang and Bo Thiesson and Chris Meek and David Blei},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {583--590},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/wang09b/wang09b.pdf},
  url       = {http://proceedings.mlr.press/v5/wang09b.html},
  abstract  = {We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.}
}
Endnote
%0 Conference Paper
%T Markov Topic Models
%A Chong Wang
%A Bo Thiesson
%A Chris Meek
%A David Blei
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-wang09b
%I PMLR
%J Proceedings of Machine Learning Research
%P 583--590
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
RIS
TY - CPAPER
TI - Markov Topic Models
AU - Chong Wang
AU - Bo Thiesson
AU - Chris Meek
AU - David Blei
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-wang09b
PB - PMLR
SP - 583
DP - PMLR
EP - 590
L1 - http://proceedings.mlr.press/v5/wang09b/wang09b.pdf
UR - http://proceedings.mlr.press/v5/wang09b.html
AB - We develop Markov topic models (MTMs), a novel family of generative graphical models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
ER -
APA
Wang, C., Thiesson, B., Meek, C. & Blei, D. (2009). Markov Topic Models. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:583-590.
