Products of Hidden Markov Models

Andrew D. Brown, Geoffrey E. Hinton
Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, PMLR R3:21-28, 2001.

Abstract

We present products of hidden Markov models (PoHMM’s), a way of combining HMM’s to form a distributed state time series model. Inference in a PoHMM is tractable and efficient. Learning of the parameters, although intractable, can be effectively done using the Product of Experts learning rule. The distributed state helps the model to explain data which has multiple causes, and the fact that each model need only explain part of the data means a PoHMM can capture longer range structure than an HMM is capable of. We show some results on modelling character strings, a simple language task and the symbolic family trees problem, which highlight these advantages.
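The abstract's claim that inference is tractable can be illustrated with a minimal sketch (not code from the paper): each component HMM scores the observation sequence independently with the standard forward algorithm, and the product model's unnormalized log-probability is simply the sum of the per-HMM log-likelihoods. The global normalizing constant over all sequences is what makes exact maximum-likelihood learning intractable. The two small HMMs below are hypothetical examples, not parameters from the paper.

```python
import numpy as np

def log_forward(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under one HMM,
    computed with the forward algorithm in log space for stability.
    pi: initial state distribution, A: state transitions (rows sum to 1),
    B: emission probabilities, shape (num_states, num_symbols)."""
    log_alpha = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, len(obs)):
        # alpha_t(j) = sum_i alpha_{t-1}(i) * A[i, j] * B[j, obs[t]]
        log_alpha = (np.logaddexp.reduce(log_alpha[:, None] + np.log(A), axis=0)
                     + np.log(B[:, obs[t]]))
    return np.logaddexp.reduce(log_alpha)

# Two hypothetical 2-state component HMMs over a 3-symbol alphabet.
hmm1 = (np.array([0.6, 0.4]),
        np.array([[0.7, 0.3], [0.2, 0.8]]),
        np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
hmm2 = (np.array([0.5, 0.5]),
        np.array([[0.9, 0.1], [0.4, 0.6]]),
        np.array([[0.2, 0.2, 0.6], [0.6, 0.3, 0.1]]))

seq = [0, 2, 1, 1]
# Unnormalized log-probability under the product of experts:
# log p~(seq) = sum over component HMMs of their individual log-likelihoods.
unnorm_log_p = sum(log_forward(seq, *h) for h in (hmm1, hmm2))
```

Because each expert's score is computed by an independent forward pass, the per-sequence cost scales linearly in the number of component HMMs; only the normalizer couples them.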

Cite this Paper


BibTeX
@InProceedings{pmlr-vR3-brown01a,
  title     = {Products of Hidden Markov Models},
  author    = {Brown, Andrew D. and Hinton, Geoffrey E.},
  booktitle = {Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics},
  pages     = {21--28},
  year      = {2001},
  editor    = {Richardson, Thomas S. and Jaakkola, Tommi S.},
  volume    = {R3},
  series    = {Proceedings of Machine Learning Research},
  month     = {04--07 Jan},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/r3/brown01a/brown01a.pdf},
  url       = {http://proceedings.mlr.press/r3/brown01a.html},
  abstract  = {We present products of hidden Markov models (PoHMM’s), a way of combining HMM’s to form a distributed state time series model. Inference in a PoHMM is tractable and efficient. Learning of the parameters, although intractable, can be effectively done using the Product of Experts learning rule. The distributed state helps the model to explain data which has multiple causes, and the fact that each model need only explain part of the data means a PoHMM can capture longer range structure than an HMM is capable of. We show some results on modelling character strings, a simple language task and the symbolic family trees problem, which highlight these advantages.},
  note      = {Reissued by PMLR on 31 March 2021.}
}
Endnote
%0 Conference Paper
%T Products of Hidden Markov Models
%A Andrew D. Brown
%A Geoffrey E. Hinton
%B Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2001
%E Thomas S. Richardson
%E Tommi S. Jaakkola
%F pmlr-vR3-brown01a
%I PMLR
%P 21--28
%U http://proceedings.mlr.press/r3/brown01a.html
%V R3
%X We present products of hidden Markov models (PoHMM’s), a way of combining HMM’s to form a distributed state time series model. Inference in a PoHMM is tractable and efficient. Learning of the parameters, although intractable, can be effectively done using the Product of Experts learning rule. The distributed state helps the model to explain data which has multiple causes, and the fact that each model need only explain part of the data means a PoHMM can capture longer range structure than an HMM is capable of. We show some results on modelling character strings, a simple language task and the symbolic family trees problem, which highlight these advantages.
%Z Reissued by PMLR on 31 March 2021.
APA
Brown, A.D. & Hinton, G.E. (2001). Products of Hidden Markov Models. Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R3:21-28. Available from http://proceedings.mlr.press/r3/brown01a.html. Reissued by PMLR on 31 March 2021.