Max-Margin Infinite Hidden Markov Models

Aonan Zhang, Jun Zhu, Bo Zhang
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):315-323, 2014.

Abstract

Infinite hidden Markov models (iHMMs) are nonparametric Bayesian extensions of hidden Markov models (HMMs) with an infinite number of states. Though flexible in describing sequential data, the generative formulation of iHMMs can limit their discriminative power in sequential prediction tasks. This paper introduces max-margin infinite HMMs (M2iHMMs), new infinite HMMs that exploit the max-margin principle for discriminative learning. Using the theory of Gibbs classifiers and data augmentation, we develop efficient beam sampling algorithms that require neither restrictive mean-field assumptions nor truncated approximations. For single-variate (non-sequential) classification, M2iHMMs reduce to a new formulation of Dirichlet process (DP) mixtures of max-margin machines. Empirical results on synthetic and real data sets show that our methods outperform competing approaches on both single-variate classification and sequential prediction tasks.
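For context on two techniques named in the abstract (both readings are inferences from the abstract, not statements about the authors' exact construction): the usual route to "Gibbs classifiers and data augmentation" in max-margin Bayesian models is the scale-mixture identity of Polson & Scott (2011). Writing \zeta = 1 - y\,f(x) for the margin residual of a classifier f, the hinge pseudo-likelihood admits

    e^{-2\max(\zeta,0)} \;=\; \int_0^\infty \frac{1}{\sqrt{2\pi\lambda}} \exp\!\left(-\frac{(\lambda+\zeta)^2}{2\lambda}\right) d\lambda,

so that, conditioned on the auxiliary variable \lambda, the hinge term becomes Gaussian in the classifier and conjugate Gibbs updates apply.

Likewise, the beam sampler builds on the slice sampler of Van Gael et al. (2008) for iHMMs. The sketch below shows one beam-sampling sweep over the hidden states of an HMM whose first K states have been instantiated; the function name, array layout, and NumPy dependency are illustrative, and the max-margin terms are omitted.

import numpy as np

def beam_sample_states(pi0, pi, lik, s, rng):
    # One beam-sampling sweep for a (truncated) HMM with K instantiated states.
    #   pi0 : (K,)   initial-state probabilities
    #   pi  : (K, K) transition probabilities, pi[j, k] = p(s_t = k | s_{t-1} = j)
    #   lik : (T, K) emission likelihoods, lik[t, k] = p(y_t | s_t = k)
    #   s   : (T,)   current state sequence, used only to draw the slices
    # In a full iHMM one would first extend the stick-breaking representation
    # until each row's leftover mass falls below min(u); K is taken fixed here.
    T, K = lik.shape

    # Slice ("beam") variables u_t ~ Uniform(0, pi[s_{t-1}, s_t]). Given u,
    # only transitions with pi > u_t survive, so the dynamic program is finite.
    u = np.empty(T)
    u[0] = rng.uniform(0.0, pi0[s[0]])
    for t in range(1, T):
        u[t] = rng.uniform(0.0, pi[s[t - 1], s[t]])

    # Forward filtering: transition probabilities are replaced by indicators,
    # because the uniform slice density cancels them.
    alpha = np.zeros((T, K))
    alpha[0] = lik[0] * (pi0 > u[0])
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = lik[t] * (alpha[t - 1] @ (pi > u[t]))
        alpha[t] /= alpha[t].sum()

    # Backward sampling of a fresh state sequence.
    s_new = np.empty(T, dtype=int)
    s_new[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * (pi[:, s_new[t + 1]] > u[t + 1])
        s_new[t] = rng.choice(K, p=w / w.sum())
    return s_new

On top of such a sweep, the augmented max-margin terms would multiply into the per-step likelihoods lik before filtering.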

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-zhangb14,
  title     = {Max-Margin Infinite Hidden Markov Models},
  author    = {Zhang, Aonan and Zhu, Jun and Zhang, Bo},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {315--323},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/zhangb14.pdf},
  url       = {https://proceedings.mlr.press/v32/zhangb14.html}
}
Endnote
%0 Conference Paper
%T Max-Margin Infinite Hidden Markov Models
%A Aonan Zhang
%A Jun Zhu
%A Bo Zhang
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-zhangb14
%I PMLR
%P 315--323
%U https://proceedings.mlr.press/v32/zhangb14.html
%V 32
%N 1
RIS
TY - CPAPER
TI - Max-Margin Infinite Hidden Markov Models
AU - Aonan Zhang
AU - Jun Zhu
AU - Bo Zhang
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-zhangb14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 315
EP - 323
L1 - http://proceedings.mlr.press/v32/zhangb14.pdf
UR - https://proceedings.mlr.press/v32/zhangb14.html
ER -
APA
Zhang, A., Zhu, J. & Zhang, B. (2014). Max-Margin Infinite Hidden Markov Models. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):315-323. Available from https://proceedings.mlr.press/v32/zhangb14.html.