Factorial Mixture of Gaussians and the Marginal Independence Model

Ricardo Silva, Zoubin Ghahramani
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:520-527, 2009.

Abstract

Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.
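To make the latent-variable construction described above concrete, the following is a minimal sketch, not the paper's parameterization or fitting procedure: each edge of a hypothetical bi-directed graph (the `edges` list below is an assumed example structure) gets its own independent latent source shared only by that edge's endpoints, so two observed variables with no common source are marginally independent by construction. Drawing each shared source from a two-component Gaussian mixture, as the illustrative `sample_source` helper does, makes the observed joint a mixture of Gaussians while preserving those independence constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed bi-directed structure X0 <-> X1 <-> X2: one independent latent
# source per edge, shared only by that edge's two endpoints. X0 and X2
# share no source, so they are marginally independent by construction.
edges = [(0, 1), (1, 2)]
n_vars = 3

def sample_source(n):
    # Each shared latent source is a symmetric two-component Gaussian
    # mixture (means +/-2, std 0.5), so the observed joint is a mixture
    # of Gaussians rather than a single Gaussian.
    comp = rng.integers(2, size=n)
    means = np.where(comp == 0, -2.0, 2.0)
    return rng.normal(loc=means, scale=0.5)

def sample(n):
    X = rng.normal(size=(n, n_vars))   # independent idiosyncratic noise
    for i, j in edges:
        z = sample_source(n)           # latent source shared by X_i, X_j
        X[:, i] += z
        X[:, j] += z
    return X

X = sample(100_000)
print(np.round(np.corrcoef(X, rowvar=False), 2))
```

Running this prints a 3x3 correlation matrix with sizeable entries for the pairs joined by an edge, (0,1) and (1,2), and an entry near zero for (0,2), reflecting the marginal independence the graph encodes.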

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-silva09b,
  title = {Factorial Mixture of Gaussians and the Marginal Independence Model},
  author = {Ricardo Silva and Zoubin Ghahramani},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages = {520--527},
  year = {2009},
  editor = {David van Dyk and Max Welling},
  volume = {5},
  series = {Proceedings of Machine Learning Research},
  address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month = {16--18 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v5/silva09b/silva09b.pdf},
  url = {http://proceedings.mlr.press/v5/silva09b.html},
  abstract = {Marginal independence constraints play an important role in learning with graphical models. One way of parameterizing a model of marginal independencies is by building a latent variable model where two independent observed variables have no common latent source. In sparse domains, however, it might be advantageous to model the marginal observed distribution directly, without explicitly including latent variables in the model. There have been recent advances in Gaussian and binary models of marginal independence, but no model with non-linear dependencies between continuous variables has been proposed so far. In this paper, we describe how to generalize the Gaussian model of marginal independencies based on mixtures, and how to learn its parameters. This requires a non-standard parameterization and raises difficult non-linear optimization issues.}
}
APA
Silva, R. & Ghahramani, Z. (2009). Factorial Mixture of Gaussians and the Marginal Independence Model. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:520-527.
