Continuous time dynamic topic models

Chong Wang, David Blei, David Heckerman
Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, PMLR R6:579-586, 2008.

Abstract

In this paper, we develop the continuous time dynamic topic model (cDTM). The cDTM is a dynamic topic model that uses Brownian motion to model the latent topics through a sequential collection of documents, where a "topic" is a pattern of word use that we expect to evolve over the course of the collection. We derive an efficient variational approximate inference algorithm that takes advantage of the sparsity of observations in text, a property that lets us easily handle many time points. In contrast to the cDTM, the original discrete-time dynamic topic model (dDTM) requires that time be discretized. Moreover, the complexity of variational inference for the dDTM grows quickly as time granularity increases, a drawback which limits fine-grained discretization. We demonstrate the cDTM on two news corpora, reporting both predictive perplexity and the novel task of time stamp prediction.
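The abstract's core generative assumption, that topics drift via Brownian motion so the variance between two documents scales with the time gap between them, can be sketched in a few lines. The following is a minimal illustration of that idea only, not the authors' implementation; the vocabulary size, drift variance, softmax link, and time stamps below are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Map natural parameters to a word distribution (logistic-normal link).
    e = np.exp(x - x.max())
    return e / e.sum()

def simulate_topic_drift(timestamps, n_words=1000, variance=0.01, seed=0):
    """Simulate one topic's word distribution drifting via Brownian motion.

    Between consecutive time stamps s and t, the topic's natural
    parameters beta evolve as beta_t ~ N(beta_s, variance * (t - s) * I),
    so unevenly spaced documents are handled without discretizing time.
    """
    rng = np.random.default_rng(seed)
    beta = rng.normal(0.0, 1.0, size=n_words)  # initial natural parameters
    snapshots = []
    prev_t = timestamps[0]
    for t in timestamps:
        gap = t - prev_t
        if gap > 0:
            beta = beta + rng.normal(0.0, np.sqrt(variance * gap), size=n_words)
        snapshots.append(softmax(beta))  # word distribution at time t
        prev_t = t
    return snapshots

# Arbitrary, unevenly spaced document time stamps (e.g., days).
times = [0.0, 0.5, 3.0, 3.1, 10.0]
topic_snapshots = simulate_topic_drift(times)
```

Because the drift variance is proportional to the elapsed gap, no fixed time grid is needed; this is the contrast with the discrete-time dDTM that the abstract draws.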

Cite this Paper


BibTeX
@InProceedings{pmlr-vR6-wang08a,
  title     = {Continuous time dynamic topic models},
  author    = {Wang, Chong and Blei, David and Heckerman, David},
  booktitle = {Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence},
  pages     = {579--586},
  year      = {2008},
  editor    = {McAllester, David A. and Myllymäki, Petri},
  volume    = {R6},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/r6/main/assets/wang08a/wang08a.pdf},
  url       = {https://proceedings.mlr.press/r6/wang08a.html},
  abstract  = {In this paper, we develop the continuous time dynamic topic model (cDTM). The cDTM is a dynamic topic model that uses Brownian motion to model the latent topics through a sequential collection of documents, where a "topic" is a pattern of word use that we expect to evolve over the course of the collection. We derive an efficient variational approximate inference algorithm that takes advantage of the sparsity of observations in text, a property that lets us easily handle many time points. In contrast to the cDTM, the original discrete-time dynamic topic model (dDTM) requires that time be discretized. Moreover, the complexity of variational inference for the dDTM grows quickly as time granularity increases, a drawback which limits fine-grained discretization. We demonstrate the cDTM on two news corpora, reporting both predictive perplexity and the novel task of time stamp prediction.},
  note      = {Reissued by PMLR on 09 October 2024.}
}
Endnote
%0 Conference Paper
%T Continuous time dynamic topic models
%A Chong Wang
%A David Blei
%A David Heckerman
%B Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2008
%E David A. McAllester
%E Petri Myllymäki
%F pmlr-vR6-wang08a
%I PMLR
%P 579--586
%U https://proceedings.mlr.press/r6/wang08a.html
%V R6
%X In this paper, we develop the continuous time dynamic topic model (cDTM). The cDTM is a dynamic topic model that uses Brownian motion to model the latent topics through a sequential collection of documents, where a "topic" is a pattern of word use that we expect to evolve over the course of the collection. We derive an efficient variational approximate inference algorithm that takes advantage of the sparsity of observations in text, a property that lets us easily handle many time points. In contrast to the cDTM, the original discrete-time dynamic topic model (dDTM) requires that time be discretized. Moreover, the complexity of variational inference for the dDTM grows quickly as time granularity increases, a drawback which limits fine-grained discretization. We demonstrate the cDTM on two news corpora, reporting both predictive perplexity and the novel task of time stamp prediction.
%Z Reissued by PMLR on 09 October 2024.
APA
Wang, C., Blei, D., & Heckerman, D. (2008). Continuous time dynamic topic models. Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research R6:579-586. Available from https://proceedings.mlr.press/r6/wang08a.html. Reissued by PMLR on 09 October 2024.
