MML Mixture Modelling of Multi-state, Poisson, vonMises circular and Gaussian Distributions

Chris S. Wallace, David L. Dowe
Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, PMLR R1:529-536, 1997.

Abstract

Minimum Message Length (MML) is an invariant Bayesian point estimation technique which is also consistent and efficient. We provide a brief overview of MML inductive inference (Wallace and Boulton (1968), Wallace and Freeman (1987)), and how it has both an information-theoretic and a Bayesian interpretation. We then outline how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob (Wallace and Boulton (1968), Wallace (1986), Wallace and Dowe (1994)) uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components. The message length is (to within a constant) the negative logarithm of the posterior probability of the theory. So, the MML theory can also be regarded as the theory with the highest posterior probability. Snob currently assumes that variables are uncorrelated, and permits multi-variate data from Gaussian, discrete multi-state, Poisson and von Mises circular distributions.
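
To make the abstract's claim about message lengths concrete: by Bayes' rule, the two-part message length of a theory H given data D satisfies

\[
I(H, D) = -\log \Pr(H) - \log \Pr(D \mid H) = -\log \Pr(H \mid D) - \log \Pr(D),
\]

and since -log Pr(D) is constant for fixed data, minimising the message length over theories is equivalent to maximising the posterior Pr(H | D). Below is a minimal runnable sketch of this equivalence over a toy discrete hypothesis space; the priors and likelihoods are invented for illustration and are unrelated to Snob's actual models.

import math

# Toy hypothesis space with assumed priors P(H) and likelihoods P(D|H).
prior = {"H1": 0.5, "H2": 0.3, "H3": 0.2}
likelihood = {"H1": 0.01, "H2": 0.05, "H3": 0.02}

# Two-part message length I(H) = -log P(H) - log P(D|H), in nats.
msg_len = {h: -math.log(prior[h]) - math.log(likelihood[h]) for h in prior}

# Posterior P(H|D) = P(H) P(D|H) / P(D).
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

# The shortest message and the highest posterior select the same theory.
assert min(msg_len, key=msg_len.get) == max(posterior, key=posterior.get)
print(min(msg_len, key=msg_len.get))  # -> H2

Snob applies the same principle at scale: for each candidate number of mixture components it computes a total message length (theory plus data encoded given the theory) and prefers the component count and parameter estimates that yield the shortest message.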

Cite this Paper


BibTeX
@InProceedings{pmlr-vR1-wallace97a,
  title     = {MML Mixture Modelling of Multi-state, Poisson, vonMises circular and Gaussian Distributions},
  author    = {Wallace, Chris S. and Dowe, David L.},
  booktitle = {Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics},
  pages     = {529--536},
  year      = {1997},
  editor    = {Madigan, David and Smyth, Padhraic},
  volume    = {R1},
  series    = {Proceedings of Machine Learning Research},
  month     = {04--07 Jan},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/r1/wallace97a/wallace97a.pdf},
  url       = {https://proceedings.mlr.press/r1/wallace97a.html},
  abstract  = {Minimum Message Length (MML) is an invariant Bayesian point estimation technique which is also consistent and efficient. We provide a brief overview of MML inductive inference (Wallace and Boulton (1968), Wallace and Freeman (1987)), and how it has both an information-theoretic and a Bayesian interpretation. We then outline how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob (Wallace and Boulton (1968), Wallace (1986), Wallace and Dowe (1994)) uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components. The message length is (to within a constant) the negative logarithm of the posterior probability of the theory. So, the MML theory can also be regarded as the theory with the highest posterior probability. Snob currently assumes that variables are uncorrelated, and permits multi-variate data from Gaussian, discrete multi-state, Poisson and von Mises circular distributions.},
  note      = {Reissued by PMLR on 30 March 2021.}
}
Endnote
%0 Conference Paper
%T MML Mixture Modelling of Multi-state, Poisson, vonMises circular and Gaussian Distributions
%A Chris S. Wallace
%A David L. Dowe
%B Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 1997
%E David Madigan
%E Padhraic Smyth
%F pmlr-vR1-wallace97a
%I PMLR
%P 529--536
%U https://proceedings.mlr.press/r1/wallace97a.html
%V R1
%X Minimum Message Length (MML) is an invariant Bayesian point estimation technique which is also consistent and efficient. We provide a brief overview of MML inductive inference (Wallace and Boulton (1968), Wallace and Freeman (1987)), and how it has both an information-theoretic and a Bayesian interpretation. We then outline how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob (Wallace and Boulton (1968), Wallace (1986), Wallace and Dowe (1994)) uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components. The message length is (to within a constant) the negative logarithm of the posterior probability of the theory. So, the MML theory can also be regarded as the theory with the highest posterior probability. Snob currently assumes that variables are uncorrelated, and permits multi-variate data from Gaussian, discrete multi-state, Poisson and von Mises circular distributions.
%Z Reissued by PMLR on 30 March 2021.
APA
Wallace, C.S. & Dowe, D.L. (1997). MML Mixture Modelling of Multi-state, Poisson, vonMises circular and Gaussian Distributions. Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R1:529-536. Available from https://proceedings.mlr.press/r1/wallace97a.html. Reissued by PMLR on 30 March 2021.

Related Material

Download PDF: http://proceedings.mlr.press/r1/wallace97a/wallace97a.pdf