Online Inference of Topics with Latent Dirichlet Allocation

Kevin Canini, Lei Shi, Thomas Griffiths
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:65-72, 2009.

Abstract

Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
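The incremental Gibbs sampler the abstract refers to can be illustrated with a minimal sketch: as each new document arrives, topics for its words are sampled from the collapsed LDA conditional, given the counts accumulated from all previously observed documents. Everything below (class name, hyperparameter values, the absence of rejuvenation steps that resample older assignments) is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

class IncrementalLDA:
    """Sketch of an incremental collapsed Gibbs sampler for LDA.

    Hyperparameters alpha/beta and all names are illustrative assumptions;
    the paper's algorithms also include steps (e.g. rejuvenation of old
    topic assignments) that are omitted here for brevity.
    """

    def __init__(self, n_topics, vocab_size, alpha=0.1, beta=0.01, seed=0):
        self.K, self.V = n_topics, vocab_size
        self.alpha, self.beta = alpha, beta
        self.rng = np.random.default_rng(seed)
        self.nkw = np.zeros((n_topics, vocab_size))  # topic-word counts
        self.nk = np.zeros(n_topics)                 # per-topic totals
        self.docs = []                               # observed documents

    def _topic_probs(self, d_counts, w):
        # Collapsed conditional:
        #   p(z = k | rest) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
        p = (d_counts + self.alpha) * (self.nkw[:, w] + self.beta) \
            / (self.nk + self.V * self.beta)
        return p / p.sum()

    def observe(self, word_ids):
        # Incremental step: sample a topic for each word of the new
        # document, conditioned on counts from all earlier documents.
        d_counts = np.zeros(self.K)
        z = np.empty(len(word_ids), dtype=int)
        for i, w in enumerate(word_ids):
            k = self.rng.choice(self.K, p=self._topic_probs(d_counts, w))
            z[i] = k
            d_counts[k] += 1
            self.nkw[k, w] += 1
            self.nk[k] += 1
        self.docs.append((np.array(word_ids), z, d_counts))

# Toy usage on a 4-word vocabulary with 2 topics.
model = IncrementalLDA(n_topics=2, vocab_size=4)
model.observe([0, 0, 1])
model.observe([2, 3, 3])
```

The particle-filter variant would instead maintain several weighted copies of these count matrices and resample them as documents arrive; the per-word conditional above is the same in both cases.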

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-canini09a,
  title     = {Online Inference of Topics with Latent Dirichlet Allocation},
  author    = {Canini, Kevin and Shi, Lei and Griffiths, Thomas},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {65--72},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/canini09a/canini09a.pdf},
  url       = {https://proceedings.mlr.press/v5/canini09a.html},
  abstract  = {Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.}
}
Endnote
%0 Conference Paper
%T Online Inference of Topics with Latent Dirichlet Allocation
%A Kevin Canini
%A Lei Shi
%A Thomas Griffiths
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-canini09a
%I PMLR
%P 65--72
%U https://proceedings.mlr.press/v5/canini09a.html
%V 5
%X Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
RIS
TY  - CPAPER
TI  - Online Inference of Topics with Latent Dirichlet Allocation
AU  - Kevin Canini
AU  - Lei Shi
AU  - Thomas Griffiths
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-canini09a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 65
EP  - 72
L1  - http://proceedings.mlr.press/v5/canini09a/canini09a.pdf
UR  - https://proceedings.mlr.press/v5/canini09a.html
AB  - Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to that of existing algorithms.
ER  -
APA
Canini, K., Shi, L., & Griffiths, T. (2009). Online Inference of Topics with Latent Dirichlet Allocation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:65-72. Available from https://proceedings.mlr.press/v5/canini09a.html.