Stochastic Variational Inference for Bayesian Time Series Models

Matthew Johnson, Alan Willsky
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1854-1862, 2014.

Abstract

Bayesian models provide powerful tools for analyzing complex time series data, but performing inference with large datasets is a challenge. Stochastic variational inference (SVI) provides a new framework for approximating model posteriors with only a small number of passes through the data, enabling such models to be fit at scale. However, its application to time series models has not been studied. In this paper we develop SVI algorithms for several common Bayesian time series models, namely the hidden Markov model (HMM), hidden semi-Markov model (HSMM), and the nonparametric HDP-HMM and HDP-HSMM. In addition, because HSMM inference can be expensive even in the minibatch setting of SVI, we develop fast approximate updates for HSMMs with duration distributions that are negative binomials or mixtures of negative binomials.
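To make the minibatch scheme concrete for the simplest model treated in the paper, the HMM, here is a minimal sketch (not the authors' implementation; names such as svi_update, rho, and n_seqs are illustrative) of one SVI update for a Bayesian HMM with discrete emissions and Dirichlet priors on the transition and emission rows. The local step runs standard forward-backward message passing under the expected log parameters to collect a minibatch's expected sufficient statistics; the global step follows the stochastic natural gradient, which for conjugate models reduces to a convex combination of the current variational parameters and the prior plus rescaled minibatch statistics.

```python
import numpy as np
from scipy.special import digamma, logsumexp

def expected_log_dirichlet(alpha):
    """Row-wise E_q[log p] under independent Dirichlet rows."""
    return digamma(alpha) - digamma(alpha.sum(axis=1, keepdims=True))

def forward_backward(log_pi, log_A, log_lik):
    """Local step: exact HMM message passing for one sequence, returning
    state marginals gamma (T x K) and expected transition counts (K x K)."""
    T, K = log_lik.shape
    log_alpha = np.empty((T, K))
    log_beta = np.zeros((T, K))
    log_alpha[0] = log_pi + log_lik[0]
    for t in range(1, T):
        log_alpha[t] = log_lik[t] + logsumexp(log_alpha[t - 1][:, None] + log_A, axis=0)
    for t in range(T - 2, -1, -1):
        log_beta[t] = logsumexp(log_A + (log_lik[t + 1] + log_beta[t + 1])[None, :], axis=1)
    log_Z = logsumexp(log_alpha[-1])
    gamma = np.exp(log_alpha + log_beta - log_Z)
    trans = np.zeros((K, K))
    for t in range(T - 1):
        trans += np.exp(log_alpha[t][:, None] + log_A
                        + (log_lik[t + 1] + log_beta[t + 1])[None, :] - log_Z)
    return gamma, trans

def svi_update(seq, trans_post, emit_post, trans_prior, emit_prior,
               log_pi, n_seqs, rho):
    """One SVI step on the Dirichlet variational parameters for the
    transition rows (K x K) and emission rows (K x V), using a minibatch
    of one sequence from a corpus of n_seqs sequences; the initial-state
    distribution is held fixed for brevity."""
    log_A = expected_log_dirichlet(trans_post)   # E_q[log A]
    log_B = expected_log_dirichlet(emit_post)    # E_q[log B]
    gamma, trans = forward_backward(log_pi, log_A, log_B[:, seq].T)
    emit = np.zeros_like(emit_post)              # expected emission counts
    for v in range(emit_post.shape[1]):
        emit[:, v] = gamma[seq == v].sum(axis=0)
    # Global step: the natural gradient for conjugate exponential families
    # blends old parameters with prior + minibatch stats rescaled by n_seqs.
    trans_post = (1 - rho) * trans_post + rho * (trans_prior + n_seqs * trans)
    emit_post = (1 - rho) * emit_post + rho * (emit_prior + n_seqs * emit)
    return trans_post, emit_post

# Toy run: K = 3 states, V = 5 symbols, corpus of 100 sequences.
rng = np.random.default_rng(0)
K, V, n_seqs = 3, 5, 100
trans_post, emit_post = np.ones((K, K)), np.ones((K, V))
log_pi = np.full(K, -np.log(K))
for it in range(50):
    seq = rng.integers(V, size=200)          # stand-in for a sampled sequence
    rho = (it + 10.0) ** -0.7                # rho_t = (t + tau)^{-kappa}
    trans_post, emit_post = svi_update(seq, trans_post, emit_post,
                                       np.ones((K, K)), np.ones((K, V)),
                                       log_pi, n_seqs, rho)
```

With step sizes rho_t = (t + tau)^{-kappa}, kappa in (0.5, 1], the iterates satisfy the standard Robbins-Monro conditions for SVI convergence to a local optimum of the variational objective. The HSMM versions in the paper replace this forward-backward pass with the more expensive semi-Markov message passing, which is what the negative-binomial duration approximations are designed to accelerate.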

Cite this Paper

BibTeX
@InProceedings{pmlr-v32-johnson14,
  title     = {Stochastic Variational Inference for Bayesian Time Series Models},
  author    = {Johnson, Matthew and Willsky, Alan},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1854--1862},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/johnson14.pdf},
  url       = {https://proceedings.mlr.press/v32/johnson14.html},
  abstract  = {Bayesian models provide powerful tools for analyzing complex time series data, but performing inference with large datasets is a challenge. Stochastic variational inference (SVI) provides a new framework for approximating model posteriors with only a small number of passes through the data, enabling such models to be fit at scale. However, its application to time series models has not been studied. In this paper we develop SVI algorithms for several common Bayesian time series models, namely the hidden Markov model (HMM), hidden semi-Markov model (HSMM), and the nonparametric HDP-HMM and HDP-HSMM. In addition, because HSMM inference can be expensive even in the minibatch setting of SVI, we develop fast approximate updates for HSMMs with duration distributions that are negative binomials or mixtures of negative binomials.}
}
Endnote
%0 Conference Paper
%T Stochastic Variational Inference for Bayesian Time Series Models
%A Matthew Johnson
%A Alan Willsky
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-johnson14
%I PMLR
%P 1854--1862
%U https://proceedings.mlr.press/v32/johnson14.html
%V 32
%N 2
%X Bayesian models provide powerful tools for analyzing complex time series data, but performing inference with large datasets is a challenge. Stochastic variational inference (SVI) provides a new framework for approximating model posteriors with only a small number of passes through the data, enabling such models to be fit at scale. However, its application to time series models has not been studied. In this paper we develop SVI algorithms for several common Bayesian time series models, namely the hidden Markov model (HMM), hidden semi-Markov model (HSMM), and the nonparametric HDP-HMM and HDP-HSMM. In addition, because HSMM inference can be expensive even in the minibatch setting of SVI, we develop fast approximate updates for HSMMs with duration distributions that are negative binomials or mixtures of negative binomials.
RIS
TY  - CPAPER
TI  - Stochastic Variational Inference for Bayesian Time Series Models
AU  - Matthew Johnson
AU  - Alan Willsky
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/06/18
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-johnson14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 2
SP  - 1854
EP  - 1862
L1  - http://proceedings.mlr.press/v32/johnson14.pdf
UR  - https://proceedings.mlr.press/v32/johnson14.html
AB  - Bayesian models provide powerful tools for analyzing complex time series data, but performing inference with large datasets is a challenge. Stochastic variational inference (SVI) provides a new framework for approximating model posteriors with only a small number of passes through the data, enabling such models to be fit at scale. However, its application to time series models has not been studied. In this paper we develop SVI algorithms for several common Bayesian time series models, namely the hidden Markov model (HMM), hidden semi-Markov model (HSMM), and the nonparametric HDP-HMM and HDP-HSMM. In addition, because HSMM inference can be expensive even in the minibatch setting of SVI, we develop fast approximate updates for HSMMs with duration distributions that are negative binomials or mixtures of negative binomials.
ER  -
APA
Johnson, M. & Willsky, A. (2014). Stochastic Variational Inference for Bayesian Time Series Models. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1854-1862. Available from https://proceedings.mlr.press/v32/johnson14.html.
