Dynamic Word Embeddings

Robert Bamler, Stephan Mandt
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:380-389, 2017.

Abstract

We present a probabilistic language model for time-stamped text data which tracks the semantic evolution of individual words over time. The model represents words and contexts by latent trajectories in an embedding space. At each moment in time, the embedding vectors are inferred from a probabilistic version of word2vec [Mikolov et al., 2013]. These embedding vectors are connected in time through a latent diffusion process. We describe two scalable variational inference algorithms, skip-gram smoothing and skip-gram filtering, that allow us to train the model jointly over all times, thus learning on all data while simultaneously allowing word and context vectors to drift. Experimental results on three different corpora demonstrate that our dynamic model infers word embedding trajectories that are more interpretable and lead to higher predictive likelihoods than competing methods that are based on static models trained separately on time slices.
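As a concrete illustration of the construction sketched in the abstract, the following is a minimal sketch of a dynamic skip-gram model written in our own notation, not the paper's. It assumes a Gaussian random-walk form for the latent diffusion prior (step variance $\tau^2$) and a Bernoulli (sigmoid) skip-gram likelihood per time slice, with $u_{i,t}$ and $v_{j,t}$ denoting the word and context vectors of word $i$ and context $j$ at time $t$, and $n^+_{ij,t}$, $n^-_{ij,t}$ denoting observed and negatively sampled word-context pair counts.

% Diffusion prior: word and context vectors drift as a Gaussian random walk over time.
\[
  u_{i,t} \mid u_{i,t-1} \sim \mathcal{N}\bigl(u_{i,t-1},\, \tau^2 I\bigr),
  \qquad
  v_{j,t} \mid v_{j,t-1} \sim \mathcal{N}\bigl(v_{j,t-1},\, \tau^2 I\bigr).
\]
% Per-time-slice likelihood: a probabilistic version of word2vec (skip-gram with
% negative sampling), where sigma(x) = 1 / (1 + exp(-x)) is the logistic sigmoid.
\[
  \log p(\mathrm{data}_t \mid U_t, V_t)
  = \sum_{i,j} \Bigl[
      n^+_{ij,t} \log \sigma\bigl(u_{i,t}^{\top} v_{j,t}\bigr)
    + n^-_{ij,t} \log \sigma\bigl(-u_{i,t}^{\top} v_{j,t}\bigr)
    \Bigr].
\]

Under this reading, skip-gram filtering would update an approximate posterior over $(U_t, V_t)$ sequentially as each time slice arrives, conditioning only on data up to time $t$, while skip-gram smoothing would additionally condition each time slice on later data; this gloss follows standard filtering/smoothing terminology and is our interpretation of the abstract, not a statement taken from the paper.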

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-bamler17a,
  title     = {Dynamic Word Embeddings},
  author    = {Robert Bamler and Stephan Mandt},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {380--389},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/bamler17a/bamler17a.pdf},
  url       = {https://proceedings.mlr.press/v70/bamler17a.html},
  abstract  = {We present a probabilistic language model for time-stamped text data which tracks the semantic evolution of individual words over time. The model represents words and contexts by latent trajectories in an embedding space. At each moment in time, the embedding vectors are inferred from a probabilistic version of word2vec [Mikolov et al., 2013]. These embedding vectors are connected in time through a latent diffusion process. We describe two scalable variational inference algorithms, skip-gram smoothing and skip-gram filtering, that allow us to train the model jointly over all times, thus learning on all data while simultaneously allowing word and context vectors to drift. Experimental results on three different corpora demonstrate that our dynamic model infers word embedding trajectories that are more interpretable and lead to higher predictive likelihoods than competing methods that are based on static models trained separately on time slices.}
}
Endnote
%0 Conference Paper
%T Dynamic Word Embeddings
%A Robert Bamler
%A Stephan Mandt
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-bamler17a
%I PMLR
%P 380--389
%U https://proceedings.mlr.press/v70/bamler17a.html
%V 70
%X We present a probabilistic language model for time-stamped text data which tracks the semantic evolution of individual words over time. The model represents words and contexts by latent trajectories in an embedding space. At each moment in time, the embedding vectors are inferred from a probabilistic version of word2vec [Mikolov et al., 2013]. These embedding vectors are connected in time through a latent diffusion process. We describe two scalable variational inference algorithms, skip-gram smoothing and skip-gram filtering, that allow us to train the model jointly over all times, thus learning on all data while simultaneously allowing word and context vectors to drift. Experimental results on three different corpora demonstrate that our dynamic model infers word embedding trajectories that are more interpretable and lead to higher predictive likelihoods than competing methods that are based on static models trained separately on time slices.
APA
Bamler, R. & Mandt, S. (2017). Dynamic Word Embeddings. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:380-389. Available from https://proceedings.mlr.press/v70/bamler17a.html.
