Factorial Mixture of Gaussians and the Marginal Independence Model

Ricardo Silva, Zoubin Ghahramani ;
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:520-527, 2009.

Abstract

Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is to build a latent variable model in which any two independent observed variables share no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies using mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.
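The latent-variable parameterization mentioned above can be illustrated with a small simulation. This is a hedged sketch, not the paper's model: it merely shows that when two observed variables load on disjoint independent latent sources they come out marginally independent, while a third variable loading on both latents correlates with each. All variable names and coefficients here are illustrative assumptions.

```python
# Illustrative only: marginal independence induced by disjoint latent sources.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent latent sources.
h1 = rng.normal(size=n)
h2 = rng.normal(size=n)

# x1 loads only on h1, x2 only on h2 -> x1 and x2 are marginally independent.
x1 = 0.9 * h1 + 0.1 * rng.normal(size=n)
x2 = 0.9 * h2 + 0.1 * rng.normal(size=n)
# x3 loads on both latents, so it is dependent on each of x1 and x2.
x3 = 0.7 * h1 + 0.7 * h2 + 0.1 * rng.normal(size=n)

c = np.corrcoef(np.stack([x1, x2, x3]))
print(c.round(3))  # corr(x1, x2) is approximately 0; corr(x1, x3) is not
```

In the Gaussian case, a zero in the marginal covariance matrix is exactly this kind of constraint; the paper's contribution is to carry such constraints to mixture models with non-linear dependencies.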
