Collapsed Variational Bayesian Inference for Hidden Markov Models


Pengyu Wang, Phil Blunsom;
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:599-607, 2013.

Abstract

Approximate inference for Bayesian models is dominated by two approaches, variational Bayesian inference and Markov chain Monte Carlo. Each approach has its own advantages and disadvantages, and they can complement each other. Recently, researchers have proposed collapsed variational Bayesian inference to combine the advantages of both. Such inference methods have been successful in several models whose hidden variables are conditionally independent given the parameters. In this paper we propose two collapsed variational Bayesian inference algorithms for hidden Markov models, a popular framework for representing time series data. We validate our algorithms on the natural language processing task of unsupervised part-of-speech induction, showing that they are both more computationally efficient than sampling, and more accurate than standard variational Bayesian inference for HMMs.
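For readers unfamiliar with the HMM framework the paper builds on, the sketch below shows the standard forward algorithm for computing the marginal likelihood of an observation sequence. This is textbook background, not the paper's collapsed variational method, and all parameter values are illustrative assumptions.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Standard HMM forward algorithm (background sketch, not the paper's method).

    pi:  (K,)   initial state distribution
    A:   (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B:   (K, V) emission matrix,   B[i, v] = P(x_t = v | z_t = i)
    obs: sequence of observed symbol indices
    Returns P(obs), the marginal likelihood of the sequence.
    """
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * B[i, x_1]
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]  # alpha_t(j) = sum_i alpha_{t-1}(i) A[i,j] B[j, x_t]
    return alpha.sum()                 # sum over final hidden states

# Illustrative two-state, two-symbol HMM (numbers are made up).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.5],
              [0.1, 0.9]])
print(forward(pi, A, B, [0, 1, 0]))
```

In the Bayesian treatment discussed in the abstract, the parameters pi, A and B are themselves given priors and integrated out rather than fixed, which is where collapsed inference comes in.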
