Online Inference of Topics with Latent Dirichlet Allocation

Kevin Canini, Lei Shi, Thomas Griffiths
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:65-72, 2009.

Abstract

Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
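Both algorithms build on the collapsed (Rao-Blackwellized) Gibbs sampler for LDA, assigning a topic to each word of a newly observed document from the collapsed conditional and optionally resampling earlier assignments. The sketch below illustrates an incremental collapsed Gibbs sampler of this general kind; the function name, parameters, and the simple rejuvenation scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def incremental_lda(docs, n_topics, vocab_size, alpha=0.1, beta=0.01,
                    rejuv_sweeps=1, seed=0):
    """Sketch of an incremental collapsed Gibbs sampler for LDA.

    Documents arrive one at a time. Each new word gets a topic drawn
    from the collapsed conditional given all assignments so far; after
    each document, a few rejuvenation sweeps resample old assignments.
    (Illustrative only; not the paper's exact algorithm.)
    """
    rng = np.random.default_rng(seed)
    nkw = np.zeros((n_topics, vocab_size))  # topic-word counts
    nk = np.zeros(n_topics)                 # total words per topic
    seen = []                               # (words, topics, doc-topic counts)

    def topic_probs(w, ndk):
        # p(z = k | rest) ∝ (ndk + alpha) * (nkw[k,w] + beta) / (nk[k] + V*beta)
        p = (ndk + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
        return p / p.sum()

    for doc in docs:
        ndk = np.zeros(n_topics)            # doc-topic counts
        zs = np.empty(len(doc), dtype=int)
        for i, w in enumerate(doc):         # assign topics to the new document
            z = rng.choice(n_topics, p=topic_probs(w, ndk))
            zs[i] = z
            ndk[z] += 1; nkw[z, w] += 1; nk[z] += 1
        seen.append((doc, zs, ndk))
        for _ in range(rejuv_sweeps):       # resample earlier assignments
            for d, z_d, ndk_d in seen:
                for i, w in enumerate(d):
                    z = z_d[i]
                    ndk_d[z] -= 1; nkw[z, w] -= 1; nk[z] -= 1
                    z = rng.choice(n_topics, p=topic_probs(w, ndk_d))
                    z_d[i] = z
                    ndk_d[z] += 1; nkw[z, w] += 1; nk[z] += 1
    return nkw, seen
```

A particle filter replaces the single chain above with a weighted set of such assignment histories, resampling particles as their weights degenerate; rejuvenation plays the same role of revisiting old assignments in both variants.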

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-canini09a,
  title = {Online Inference of Topics with Latent Dirichlet Allocation},
  author = {Kevin Canini and Lei Shi and Thomas Griffiths},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages = {65--72},
  year = {2009},
  editor = {David van Dyk and Max Welling},
  volume = {5},
  series = {Proceedings of Machine Learning Research},
  address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month = {16--18 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v5/canini09a/canini09a.pdf},
  url = {http://proceedings.mlr.press/v5/canini09a.html},
  abstract = {Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.}
}
Endnote
%0 Conference Paper
%T Online Inference of Topics with Latent Dirichlet Allocation
%A Kevin Canini
%A Lei Shi
%A Thomas Griffiths
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-canini09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 65--72
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
RIS
TY  - CPAPER
TI  - Online Inference of Topics with Latent Dirichlet Allocation
AU  - Kevin Canini
AU  - Lei Shi
AU  - Thomas Griffiths
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY  - 2009/04/15
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-canini09a
PB  - PMLR
DP  - PMLR
SP  - 65
EP  - 72
L1  - http://proceedings.mlr.press/v5/canini09a/canini09a.pdf
UR  - http://proceedings.mlr.press/v5/canini09a.html
AB  - Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
ER  -
APA
Canini, K., Shi, L., & Griffiths, T. (2009). Online Inference of Topics with Latent Dirichlet Allocation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:65-72.