# MML Mixture Modelling of Multi-state, Poisson, von Mises circular and Gaussian Distributions

*Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics*, PMLR R1:529-536, 1997.

#### Abstract

Minimum Message Length (MML) is an invariant Bayesian point estimation technique which is also consistent and efficient. We provide a brief overview of MML inductive inference (Wallace and Boulton (1968), Wallace and Freeman (1987)), and how it has both an information-theoretic and a Bayesian interpretation. We then outline how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob (Wallace and Boulton (1968), Wallace (1986), Wallace and Dowe (1994)), uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components. The message length is (to within a constant) the logarithm of the posterior probability of the theory. So, the MML theory can also be regarded as the theory with the highest posterior probability. Snob currently assumes that variables are uncorrelated, and permits multi-variate data from Gaussian, discrete multi-state, Poisson and von Mises circular distributions.
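The abstract's central claim, that the message length is (to within a constant) the negative logarithm of the posterior probability, so that minimising message length is equivalent to maximising posterior probability, can be illustrated with a small sketch. This is not the Snob program itself; the candidate models, priors, and likelihoods below are made-up illustrative numbers, and a real two-part MML message would encode parameter estimates to optimal precision rather than use exact priors.

```python
import math

# Hedged sketch of two-part message-length model selection (not Snob):
# the first part of the message encodes the model, the second part
# encodes the data given the model. Numbers are purely illustrative.
candidates = {
    "1 component":  {"prior": 0.5, "likelihood": 1e-12},
    "2 components": {"prior": 0.3, "likelihood": 5e-10},
    "3 components": {"prior": 0.2, "likelihood": 6e-10},
}

def message_length_bits(prior, likelihood):
    # length = -log2(prior) - log2(likelihood) = -log2(prior * likelihood),
    # so minimising length maximises prior * likelihood, i.e. the posterior
    # (the normalising constant is the same for every candidate).
    return -math.log2(prior) - math.log2(likelihood)

lengths = {name: message_length_bits(m["prior"], m["likelihood"])
           for name, m in candidates.items()}
best = min(lengths, key=lengths.get)
# "2 components" wins here: 0.3 * 5e-10 exceeds both 0.5 * 1e-12
# and 0.2 * 6e-10, so it has the shortest message.
```

The same comparison framed as posterior probabilities would pick the same model, which is the sense in which the MML theory is also the highest-posterior theory.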