Infinite Hierarchical Hidden Markov Models

Katherine Heller, Yee Whye Teh, Dilan Gorur
Proceedings of the Twelth International Conference on Artificial Intelligence and Statistics, PMLR 5:224-231, 2009.

Abstract

In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-heller09a,
  title     = {Infinite Hierarchical Hidden Markov Models},
  author    = {Heller, Katherine and Teh, Yee Whye and Gorur, Dilan},
  booktitle = {Proceedings of the Twelth International Conference on Artificial Intelligence and Statistics},
  pages     = {224--231},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/heller09a/heller09a.pdf},
  url       = {https://proceedings.mlr.press/v5/heller09a.html},
  abstract  = {In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.}
}
Endnote
%0 Conference Paper
%T Infinite Hierarchical Hidden Markov Models
%A Katherine Heller
%A Yee Whye Teh
%A Dilan Gorur
%B Proceedings of the Twelth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-heller09a
%I PMLR
%P 224--231
%U https://proceedings.mlr.press/v5/heller09a.html
%V 5
%X In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.
RIS
TY  - CPAPER
TI  - Infinite Hierarchical Hidden Markov Models
AU  - Katherine Heller
AU  - Yee Whye Teh
AU  - Dilan Gorur
BT  - Proceedings of the Twelth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-heller09a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 224
EP  - 231
L1  - http://proceedings.mlr.press/v5/heller09a/heller09a.pdf
UR  - https://proceedings.mlr.press/v5/heller09a.html
AB  - In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.
ER  -
APA
Heller, K., Teh, Y.W. &amp; Gorur, D. (2009). Infinite Hierarchical Hidden Markov Models. Proceedings of the Twelth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:224-231. Available from https://proceedings.mlr.press/v5/heller09a.html.