Stochastic Variational Inference for the HDP-HMM

Aonan Zhang, San Gultekin, John Paisley
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:800-808, 2016.

Abstract

We derive a variational inference algorithm for the HDP-HMM based on the two-level stick-breaking construction. This construction has previously been applied to the hierarchical Dirichlet process (HDP) for mixed membership models, allowing for efficient handling of the coupled weight parameters. However, the same algorithm is not directly applicable to HDP-based infinite hidden Markov models (HDP-HMM) because of extra sequential dependencies in the Markov chain. In this paper we provide a solution to this problem by deriving a variational inference algorithm for the HDP-HMM, as well as its stochastic extension, for which all parameter updates are in closed form. We apply our algorithm to sequential text analysis and audio signal analysis, comparing our results with the beam-sampled iHMM, the parametric HMM, and other variational inference approximations.
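The two-level construction mentioned above can be illustrated with a minimal sketch: a top-level truncated stick-breaking draw produces the global state weights, and each state's transition distribution is then a Dirichlet-process draw centered on those weights. The sketch below uses a finite Dirichlet approximation for the second level rather than the paper's exact construction; the truncation level `K` and all hyperparameter values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking: K weights that sum to at most 1."""
    v = rng.beta(1.0, alpha, size=K)               # stick proportions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining                            # beta_k = v_k * prod_{j<k}(1 - v_j)

gamma, alpha, K = 1.0, 1.0, 20

# Top level: global state weights beta ~ GEM(gamma), truncated at K states.
beta = stick_breaking(gamma, K, rng)

# Second level: row k of the transition matrix is drawn from a DP centered
# on beta. Here this is approximated by a finite Dirichlet with mean beta.
pi = rng.dirichlet(alpha * beta + 1e-8, size=K)

# Each row of pi is a valid transition distribution over the K states.
assert np.allclose(pi.sum(axis=1), 1.0)
```

The shared base measure `beta` is what couples the rows of `pi`: states that are globally probable receive high transition mass from every state, which is exactly the weight coupling the abstract refers to.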

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-zhang16a,
  title     = {Stochastic Variational Inference for the HDP-HMM},
  author    = {Zhang, Aonan and Gultekin, San and Paisley, John},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {800--808},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/zhang16a.pdf},
  url       = {https://proceedings.mlr.press/v51/zhang16a.html},
  abstract  = {We derive a variational inference algorithm for the HDP-HMM based on the two-level stick breaking construction. This construction has previously been applied to the hierarchical Dirichlet processes (HDP) for mixed membership models, allowing for efficient handling of the coupled weight parameters. However, the same algorithm is not directly applicable to HDP-based infinite hidden Markov models (HDP-HMM) because of extra sequential dependencies in the Markov chain. In this paper we provide a solution to this problem by deriving a variational inference algorithm for the HDP-HMM, as well as its stochastic extension, for which all parameter updates are in closed form. We apply our algorithm to sequential text analysis and audio signal analysis, comparing our results with the beam-sampled iHMM, the parametric HMM, and other variational inference approximations.}
}
APA
Zhang, A., Gultekin, S., & Paisley, J. (2016). Stochastic Variational Inference for the HDP-HMM. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:800-808. Available from https://proceedings.mlr.press/v51/zhang16a.html.
