Stochastic Variational Inference for Dynamic Correlated Topic Models

Federico Tomasi, Praveen Chandar, Gal Levy-Fix, Mounia Lalmas-Roelleke, Zhenwen Dai
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:859-868, 2020.

Abstract

Correlated topic models (CTM) are useful tools for the statistical analysis of documents: they explicitly capture the correlation between the topics associated with each document. We propose an extension to CTM that models the evolution of both topic correlation and word co-occurrence over time. This allows us to identify changes in topic correlations over time; for example, in the machine learning literature, the correlation between the topics “stochastic gradient descent” and “variational inference” has increased in recent years due to advances in stochastic variational inference methods. Our temporal dynamic priors are based on Gaussian processes (GPs), allowing us to capture diverse temporal behaviours such as smooth, long-term memory, temporally concentrated, and periodic dynamics. The evolution of topic correlations is modelled through generalised Wishart processes (GWPs). We develop a stochastic variational inference method that enables us to handle large sets of continuous temporal data. Our experiments on real-world data demonstrate that the model can be used to effectively discover temporal patterns of topic distributions, words associated with topics, and topic relationships.
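The sketch below is not from the paper; the function names, kernels, and parameter choices are illustrative assumptions. It only shows the prior construction the abstract refers to: a generalised Wishart process builds a time-varying K x K topic covariance Sigma(t) = sum_i L u_i(t) u_i(t)^T L^T from independent Gaussian-process draws, and the choice of GP kernel (e.g. squared-exponential for smooth dynamics, a periodic kernel for recurring patterns) controls how the correlations evolve over time.

```python
# Minimal sketch of a generalised Wishart process prior over topic covariances.
# This is an illustration of the construction, not the paper's implementation
# (the paper fits such a model with stochastic variational inference).
import numpy as np

def rbf_kernel(times, lengthscale=2.0):
    # Squared-exponential kernel: smooth correlation trajectories.
    d = times[:, None] - times[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def periodic_kernel(times, period=5.0, lengthscale=1.0):
    # Periodic kernel: recurring (seasonal) correlation patterns.
    d = np.abs(times[:, None] - times[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / lengthscale ** 2)

def sample_gwp_prior(times, num_topics, kernel=rbf_kernel, nu=None, seed=0):
    """Draw one sample path of time-varying topic covariances from a GWP prior."""
    rng = np.random.default_rng(seed)
    K, T = num_topics, len(times)
    nu = nu or K                                # degrees of freedom (>= K)
    K_tt = kernel(times) + 1e-6 * np.eye(T)     # GP covariance over time points (jittered)
    chol = np.linalg.cholesky(K_tt)
    # Each of the nu*K columns is an independent zero-mean GP sample path over time.
    u = (chol @ rng.standard_normal((T, nu * K))).reshape(T, nu, K)
    L = np.eye(K)                               # lower-triangular scale matrix (identity here)
    sigmas = np.empty((T, K, K))
    for t in range(T):
        U = u[t]                                # (nu, K) GP values at time t
        sigmas[t] = L @ U.T @ U @ L.T           # Sigma(t) = sum_i L u_i(t) u_i(t)^T L^T
    return sigmas

# Example: 4 topics over 50 time points with smooth temporal behaviour.
times = np.linspace(0.0, 10.0, 50)
covs = sample_gwp_prior(times, num_topics=4, kernel=rbf_kernel)
print(covs.shape)  # (50, 4, 4): one positive semi-definite topic covariance per time point
```

In the full model, a document observed at time t would then receive CTM-style topic proportions by drawing a Gaussian vector with covariance Sigma(t) and mapping it through a softmax; the paper replaces prior sampling with stochastic variational inference so that large corpora can be processed in mini-batches.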

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-tomasi20a,
  title     = {Stochastic Variational Inference for Dynamic Correlated Topic Models},
  author    = {Tomasi, Federico and Chandar, Praveen and Levy-Fix, Gal and Lalmas-Roelleke, Mounia and Dai, Zhenwen},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {859--868},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/tomasi20a/tomasi20a.pdf},
  url       = {https://proceedings.mlr.press/v124/tomasi20a.html},
  abstract  = {Correlated topic models (CTM) are useful tools for statistical analysis of documents. They explicitly capture the correlation between topics associated with each document. We propose an extension to CTM that models the evolution of both topic correlation and word co-occurrence over time. This allows us to identify the changes of topic correlations over time, e.g., in the machine learning literature, the correlation between the topics “stochastic gradient descent” and “variational inference” increased in the last few years due to advances in stochastic variational inference methods. Our temporal dynamic priors are based on Gaussian processes (GPs), allowing us to capture diverse temporal behaviours such as smooth, with long-term memory, temporally concentrated, and periodic. The evolution of topic correlations is modeled through generalised Wishart processes (GWPs). We develop a stochastic variational inference method, which enables us to handle large sets of continuous temporal data. Our experiments applied to real world data demonstrate that our model can be used to effectively discover temporal patterns of topic distributions, words associated to topics and topic relationships.}
}
Endnote
%0 Conference Paper
%T Stochastic Variational Inference for Dynamic Correlated Topic Models
%A Federico Tomasi
%A Praveen Chandar
%A Gal Levy-Fix
%A Mounia Lalmas-Roelleke
%A Zhenwen Dai
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-tomasi20a
%I PMLR
%P 859--868
%U https://proceedings.mlr.press/v124/tomasi20a.html
%V 124
%X Correlated topic models (CTM) are useful tools for statistical analysis of documents. They explicitly capture the correlation between topics associated with each document. We propose an extension to CTM that models the evolution of both topic correlation and word co-occurrence over time. This allows us to identify the changes of topic correlations over time, e.g., in the machine learning literature, the correlation between the topics “stochastic gradient descent” and “variational inference” increased in the last few years due to advances in stochastic variational inference methods. Our temporal dynamic priors are based on Gaussian processes (GPs), allowing us to capture diverse temporal behaviours such as smooth, with long-term memory, temporally concentrated, and periodic. The evolution of topic correlations is modeled through generalised Wishart processes (GWPs). We develop a stochastic variational inference method, which enables us to handle large sets of continuous temporal data. Our experiments applied to real world data demonstrate that our model can be used to effectively discover temporal patterns of topic distributions, words associated to topics and topic relationships.
APA
Tomasi, F., Chandar, P., Levy-Fix, G., Lalmas-Roelleke, M. & Dai, Z. (2020). Stochastic Variational Inference for Dynamic Correlated Topic Models. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:859-868. Available from https://proceedings.mlr.press/v124/tomasi20a.html.