Mixed Memory Markov Models

Lawrence K. Saul, Michael I. Jordan
Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, PMLR R1:437-444, 1997.

Abstract

We consider how to parameterize Markov models with prohibitively large state spaces. This is done by representing the transition matrix as a convex combination, or mixture, of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We give examples where these models may be a faithful and/or useful representation of the underlying dynamics. We also derive a set of generalized Baum-Welch updates for hidden Markov models (HMMs) that make use of this parameterization. Because these models decompose the hidden state as the Cartesian product of two or more random variables, they are well suited to the modeling of coupled time series.
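
To make the parameterization concrete, the sketch below (not the authors' code) shows an order-k mixed memory model in Python/NumPy: the conditional distribution over the next state is a convex combination of k lag-specific first-order transition matrices, so an order-k model over n states needs only O(kn^2) parameters rather than the O(n^(k+1)) of a full transition table. The names psi, A, and transition_prob are illustrative assumptions; in the paper the parameters are fitted by EM, whereas here they are randomly initialized only to show that the mixture yields a valid conditional distribution.

    import numpy as np

    # Minimal sketch of the mixed memory parameterization (illustrative only):
    # P(i_t | i_{t-1}, ..., i_{t-k}) = sum_mu psi[mu] * a^mu(i_t | i_{t-mu-1}),
    # where psi is a distribution over lags and each a^mu is a first-order
    # transition matrix.
    rng = np.random.default_rng(0)
    n, k = 4, 3                                 # number of states, memory length

    psi = rng.dirichlet(np.ones(k))             # mixing weights: psi[mu] >= 0, sum to 1
    A = rng.dirichlet(np.ones(n), size=(k, n))  # A[mu, j, i] = a^mu(i_t = i | i_{t-mu-1} = j);
                                                # each row A[mu, j, :] sums to 1

    def transition_prob(history, i):
        """P(i_t = i | i_{t-1}, ..., i_{t-k}); history[mu] is the state at lag mu+1."""
        return sum(psi[mu] * A[mu, history[mu], i] for mu in range(k))

    history = [2, 0, 1]                         # (i_{t-1}, i_{t-2}, i_{t-3})
    probs = [transition_prob(history, i) for i in range(n)]
    assert np.isclose(sum(probs), 1.0)          # convex combination of valid
                                                # distributions is a valid distribution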

Cite this Paper


BibTeX
@InProceedings{pmlr-vR1-saul97a,
  title = {Mixed Memory {M}arkov Models},
  author = {Saul, Lawrence K. and Jordan, Michael I.},
  booktitle = {Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics},
  pages = {437--444},
  year = {1997},
  editor = {Madigan, David and Smyth, Padhraic},
  volume = {R1},
  series = {Proceedings of Machine Learning Research},
  month = {04--07 Jan},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/r1/saul97a/saul97a.pdf},
  url = {https://proceedings.mlr.press/r1/saul97a.html},
  abstract = {We consider how to parameterize Markov models with prohibitively large state spaces. This is done by representing the transition matrix as a convex combination, or mixture, of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We give examples where these models may be a faithful and/or useful representation of the underlying dynamics. We also derive a set of generalized Baum-Welch updates for hidden Markov models (HMMs) that make use of this parameterization. Because these models decompose the hidden state as the Cartesian product of two or more random variables, they are well suited to the modeling of coupled time series.},
  note = {Reissued by PMLR on 30 March 2021.}
}
Endnote
%0 Conference Paper
%T Mixed Memory Markov Models
%A Lawrence K. Saul
%A Michael I. Jordan
%B Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 1997
%E David Madigan
%E Padhraic Smyth
%F pmlr-vR1-saul97a
%I PMLR
%P 437--444
%U https://proceedings.mlr.press/r1/saul97a.html
%V R1
%X We consider how to parameterize Markov models with prohibitively large state spaces. This is done by representing the transition matrix as a convex combination, or mixture, of simpler dynamical models. The parameters in these models admit a simple probabilistic interpretation and can be fitted iteratively by an Expectation-Maximization (EM) procedure. We give examples where these models may be a faithful and/or useful representation of the underlying dynamics. We also derive a set of generalized Baum-Welch updates for hidden Markov models (HMMs) that make use of this parameterization. Because these models decompose the hidden state as the Cartesian product of two or more random variables, they are well suited to the modeling of coupled time series.
%Z Reissued by PMLR on 30 March 2021.
APA
Saul, L.K. & Jordan, M.I. (1997). Mixed Memory Markov Models. Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R1:437-444. Available from https://proceedings.mlr.press/r1/saul97a.html. Reissued by PMLR on 30 March 2021.