Hierarchical learning of Hidden Markov Models with clustering regularization

Hui Lan, Antoni B. Chan
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:1628-1638, 2021.

Abstract

Hierarchical learning of generative models is useful for representing and interpreting complex data. For instance, one application is to learn an HMM to represent an individual’s eye fixations on a stimulus, and then cluster the individuals’ HMMs to discover common eye-gaze strategies. However, learning the individual representation models from observations and clustering the individual models into group models are often treated as two separate tasks. In this paper, we propose a novel tree-structured variational Bayesian method that learns the individual models and group models simultaneously by treating the group models as parents of the individual models, so that each individual model is learned from its observations and regularized by its parent, and, conversely, each parent model is optimized to best represent its children. Due to this regularization, our method is advantageous when the number of training samples is small. Experimental results on synthetic datasets demonstrate the effectiveness of the proposed method.
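To make the coupling between the two levels concrete, the following is a schematic variational lower bound consistent with the abstract, not the paper's exact objective. The symbols are illustrative assumptions: $y_i$ denotes the observations of individual $i$, $q(\theta_i)$ is the variational posterior over that individual's HMM parameters, $\phi_k$ are the parameters of group (parent) model $k$, and $q(z_i)$ is the posterior over which group individual $i$ belongs to:

$$
\mathcal{L} = \sum_i \mathbb{E}_{q(\theta_i)}\big[\log p(y_i \mid \theta_i)\big]
\;-\; \sum_i \sum_k q(z_i{=}k)\,\mathrm{KL}\big(q(\theta_i)\,\|\,p(\theta_i \mid \phi_k)\big)
\;-\; \sum_i \mathrm{KL}\big(q(z_i)\,\|\,p(z_i)\big).
$$

In such an objective, the first term fits each individual HMM to its own observations, the second term regularizes it toward its assigned parent model, and maximizing that same term with respect to $\phi_k$ pushes each parent to best represent its children, which matches the mutual dependence described in the abstract.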

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-lan21a,
  title     = {Hierarchical learning of Hidden Markov Models with clustering regularization},
  author    = {Lan, Hui and Chan, Antoni B.},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages     = {1628--1638},
  year      = {2021},
  editor    = {de Campos, Cassio and Maathuis, Marloes H.},
  volume    = {161},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v161/lan21a/lan21a.pdf},
  url       = {https://proceedings.mlr.press/v161/lan21a.html}
}
Endnote
%0 Conference Paper
%T Hierarchical learning of Hidden Markov Models with clustering regularization
%A Hui Lan
%A Antoni B. Chan
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-lan21a
%I PMLR
%P 1628--1638
%U https://proceedings.mlr.press/v161/lan21a.html
%V 161
APA
Lan, H. & Chan, A. B. (2021). Hierarchical learning of Hidden Markov Models with clustering regularization. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:1628-1638. Available from https://proceedings.mlr.press/v161/lan21a.html.
