Infinite Hierarchical Hidden Markov Models

Katherine Heller, Yee Whye Teh, Dilan Gorur
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:224-231, 2009.

Abstract

In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.
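The abstract mentions inference via a "modified forward-backtrack algorithm," which builds on the classic forward recursion for ordinary HMMs. As background, here is a minimal sketch of that standard forward pass; it is an illustration of the flat-HMM building block only, not the IHHMM algorithm from the paper, and all variable names and numbers are hypothetical.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Likelihood P(obs | model) for a standard HMM via the forward recursion.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B   : (K, M) emission matrix,   B[k, m] = P(x_t = m | z_t = k)
    obs : list of observed symbol indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    # Base case: joint probability of first observation and each state.
    alpha[0] = pi * B[:, obs[0]]
    # Recursion: propagate through transitions, then weight by emission.
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    # Marginalize over the final hidden state.
    return alpha[-1].sum()

# Toy example with made-up parameters: a deterministic 2-state chain.
pi = np.array([1.0, 0.0])
A = np.eye(2)
B = np.eye(2)
print(forward(pi, A, B, [0, 0, 0]))  # a chain that stays in state 0
```

A hierarchical variant would run coupled recursions across levels of the state hierarchy; the paper's contribution is making the number of such levels unbounded and sampling them with Gibbs steps.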

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-heller09a,
  title     = {Infinite Hierarchical Hidden Markov Models},
  author    = {Katherine Heller and Yee Whye Teh and Dilan Gorur},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {224--231},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/heller09a/heller09a.pdf},
  url       = {http://proceedings.mlr.press/v5/heller09a.html},
  abstract  = {In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.}
}
Endnote
%0 Conference Paper
%T Infinite Hierarchical Hidden Markov Models
%A Katherine Heller
%A Yee Whye Teh
%A Dilan Gorur
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-heller09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 224--231
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.
RIS
TY  - CPAPER
TI  - Infinite Hierarchical Hidden Markov Models
AU  - Katherine Heller
AU  - Yee Whye Teh
AU  - Dilan Gorur
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY  - 2009/04/15
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-heller09a
PB  - PMLR
DP  - PMLR
SP  - 224
EP  - 231
L1  - http://proceedings.mlr.press/v5/heller09a/heller09a.pdf
UR  - http://proceedings.mlr.press/v5/heller09a.html
AB  - In this paper we present the Infinite Hierarchical Hidden Markov Model (IHHMM), a nonparametric generalization of Hierarchical Hidden Markov Models (HHMMs). HHMMs have been used for modeling sequential data in applications such as speech recognition, detecting topic transitions in video and extracting information from text. The IHHMM provides more flexible modeling of sequential data by allowing a potentially unbounded number of levels in the hierarchy, instead of requiring the specification of a fixed hierarchy depth. Inference and learning are performed efficiently using Gibbs sampling and a modified forward-backtrack algorithm. We show encouraging demonstrations of the workings of the IHHMM.
ER  -
APA
Heller, K., Teh, Y.W. & Gorur, D. (2009). Infinite Hierarchical Hidden Markov Models. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:224-231.