Latent Topic Networks: A Versatile Probabilistic Programming Framework for Topic Models

James Foulds, Shachi Kumar, Lise Getoor
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:777-786, 2015.

Abstract

Topic models have become increasingly prominent text-analytic machine learning tools for research in the social sciences and the humanities. In particular, custom topic models can be developed to answer specific research questions. The design of these models requires a non-trivial amount of effort and expertise, motivating general-purpose topic modeling frameworks. In this paper we introduce latent topic networks, a flexible class of richly structured topic models designed to facilitate applied research. Custom models can straightforwardly be developed in our framework with an intuitive first-order logical probabilistic programming language. Latent topic networks admit scalable training via a parallelizable EM algorithm which leverages ADMM in the M-step. We demonstrate the broad applicability of the models with case studies on modeling influence in citation networks, and U.S. Presidential State of the Union addresses.
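To make the training setup concrete: the paper's latent topic networks use a parallelizable EM algorithm with ADMM in the M-step, which is not reproduced here. As a minimal illustrative sketch of EM applied to a simple topic model, the following implements plain PLSA-style EM on a toy document-word count matrix (all data and variable names are invented for illustration; this is not the paper's algorithm or framework).

```python
import numpy as np

# Toy document-word count matrix: 4 "documents" over a 6-word vocabulary.
# Documents 0-1 favor words 0-2; documents 2-3 favor words 3-5.
counts = np.array([
    [5, 4, 3, 0, 0, 1],
    [4, 5, 4, 1, 0, 0],
    [0, 1, 0, 5, 4, 4],
    [1, 0, 0, 4, 5, 5],
], dtype=float)

rng = np.random.default_rng(0)
n_docs, n_words = counts.shape
n_topics = 2

# Randomly initialise P(topic|doc) and P(word|topic), each row-normalised.
theta = rng.random((n_docs, n_topics))
theta /= theta.sum(axis=1, keepdims=True)
phi = rng.random((n_topics, n_words))
phi /= phi.sum(axis=1, keepdims=True)

for _ in range(50):
    # E-step: responsibilities P(topic | doc, word), shape (docs, topics, words).
    joint = theta[:, :, None] * phi[None, :, :]
    resp = joint / joint.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from expected topic-word counts.
    expected = counts[:, None, :] * resp
    theta = expected.sum(axis=2)
    theta /= theta.sum(axis=1, keepdims=True)
    phi = expected.sum(axis=0)
    phi /= phi.sum(axis=1, keepdims=True)
```

On this clearly separable toy data, EM recovers two topics that split the vocabulary along the block structure of the counts. The paper's contribution is to replace this closed-form M-step with a constrained optimization (solved via ADMM) so that first-order logical rules can shape the learned distributions.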

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-foulds15,
  title     = {Latent Topic Networks: A Versatile Probabilistic Programming Framework for Topic Models},
  author    = {Foulds, James and Kumar, Shachi and Getoor, Lise},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {777--786},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/foulds15.pdf},
  url       = {https://proceedings.mlr.press/v37/foulds15.html}
}
Endnote
%0 Conference Paper
%T Latent Topic Networks: A Versatile Probabilistic Programming Framework for Topic Models
%A James Foulds
%A Shachi Kumar
%A Lise Getoor
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-foulds15
%I PMLR
%P 777--786
%U https://proceedings.mlr.press/v37/foulds15.html
%V 37
RIS
TY  - CPAPER
TI  - Latent Topic Networks: A Versatile Probabilistic Programming Framework for Topic Models
AU  - James Foulds
AU  - Shachi Kumar
AU  - Lise Getoor
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-foulds15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 777
EP  - 786
L1  - http://proceedings.mlr.press/v37/foulds15.pdf
UR  - https://proceedings.mlr.press/v37/foulds15.html
ER  -
APA
Foulds, J., Kumar, S., & Getoor, L. (2015). Latent Topic Networks: A Versatile Probabilistic Programming Framework for Topic Models. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:777-786. Available from https://proceedings.mlr.press/v37/foulds15.html.