Relational Topic Models for Document Networks

Jonathan Chang, David Blei
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:81-88, 2009.

Abstract

We develop the relational topic model (RTM), a model of documents and the links between them. For each pair of documents, the RTM models their link as a binary random variable that is conditioned on their contents. The model can be used to summarize a network of documents, predict links between them, and predict words within them. We derive efficient inference and learning algorithms based on variational methods and evaluate the predictive performance of the RTM for large networks of scientific abstracts and web documents.
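The abstract's link model can be stated a little more concretely. In the RTM, each document's words arise from a topic mixture (as in LDA), and the link between a pair of documents is a Bernoulli variable whose probability depends on the topic assignments of both documents. The Python sketch below illustrates one such link likelihood, a logistic function of a weighted element-wise product of the two documents' topic proportions; the topic proportions, weights eta, and intercept nu here are illustrative placeholders, not values estimated by the paper's variational inference algorithm.

import numpy as np

def link_probability(zbar_d, zbar_dprime, eta, nu):
    """Probability that documents d and d' are linked, given their
    mean topic assignments (empirical topic proportions)."""
    # The element-wise product rewards pairs that put mass on the same topics.
    score = eta @ (zbar_d * zbar_dprime) + nu
    return 1.0 / (1.0 + np.exp(-score))  # logistic function

# Example with K = 3 topics: two documents concentrated on topic 0.
zbar_a = np.array([0.7, 0.2, 0.1])
zbar_b = np.array([0.6, 0.3, 0.1])
eta = np.array([2.0, 1.0, 0.5])  # hypothetical regression weights
nu = -1.0                        # hypothetical intercept
print(link_probability(zbar_a, zbar_b, eta, nu))  # probability of a link

Pairs of documents whose topic proportions overlap on heavily weighted topics receive a higher link probability, which is what allows a fitted model to predict links from document contents and, conversely, to use observed links when inferring topics.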

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-chang09a,
  title     = {Relational Topic Models for Document Networks},
  author    = {Chang, Jonathan and Blei, David},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {81--88},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/chang09a/chang09a.pdf},
  url       = {https://proceedings.mlr.press/v5/chang09a.html},
  abstract  = {We develop the relational topic model (RTM), a model of documents and the links between them. For each pair of documents, the RTM models their link as a binary random variable that is conditioned on their contents. The model can be used to summarize a network of documents, predict links between them, and predict words within them. We derive efficient inference and learning algorithms based on variational methods and evaluate the predictive performance of the RTM for large networks of scientific abstracts and web documents.}
}
Endnote
%0 Conference Paper
%T Relational Topic Models for Document Networks
%A Jonathan Chang
%A David Blei
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-chang09a
%I PMLR
%P 81--88
%U https://proceedings.mlr.press/v5/chang09a.html
%V 5
%X We develop the relational topic model (RTM), a model of documents and the links between them. For each pair of documents, the RTM models their link as a binary random variable that is conditioned on their contents. The model can be used to summarize a network of documents, predict links between them, and predict words within them. We derive efficient inference and learning algorithms based on variational methods and evaluate the predictive performance of the RTM for large networks of scientific abstracts and web documents.
RIS
TY - CPAPER
TI - Relational Topic Models for Document Networks
AU - Jonathan Chang
AU - David Blei
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-chang09a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 5
SP - 81
EP - 88
L1 - http://proceedings.mlr.press/v5/chang09a/chang09a.pdf
UR - https://proceedings.mlr.press/v5/chang09a.html
AB - We develop the relational topic model (RTM), a model of documents and the links between them. For each pair of documents, the RTM models their link as a binary random variable that is conditioned on their contents. The model can be used to summarize a network of documents, predict links between them, and predict words within them. We derive efficient inference and learning algorithms based on variational methods and evaluate the predictive performance of the RTM for large networks of scientific abstracts and web documents.
ER -
APA
Chang, J. & Blei, D. (2009). Relational Topic Models for Document Networks. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:81-88. Available from https://proceedings.mlr.press/v5/chang09a.html.