Nonlinear Statistical Learning with Truncated Gaussian Graphical Models

Qinliang Su, Xuejun Liao, Changyou Chen, Lawrence Carin
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1948-1957, 2016.

Abstract

We introduce the truncated Gaussian graphical model (TGGM) as a novel framework for designing statistical models for nonlinear learning. A TGGM is a Gaussian graphical model (GGM) with a subset of variables truncated to be nonnegative. The truncated variables are assumed latent and integrated out to induce a marginal model. We show that the variables in the marginal model are non-Gaussian distributed and their expected relations are nonlinear. We use expectation-maximization to break the inference of the nonlinear model into a sequence of TGGM inference problems, each of which is efficiently solved by using the properties and numerical methods of multivariate Gaussian distributions. We use the TGGM to design models for nonlinear regression and classification, with the performances of these models demonstrated on extensive benchmark datasets and compared to state-of-the-art competing results.
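The abstract's claim that the marginal variables' "expected relations are nonlinear" can be illustrated with a standard property of the truncated Gaussian: for w ~ N(mu, sigma^2) truncated to w >= 0, the conditional mean is mu + sigma * phi(mu/sigma) / Phi(mu/sigma), a smoothed ReLU of mu. The sketch below is illustrative only (the helper names `truncated_mean`, `phi`, `Phi` are ours, not from the paper) and checks the closed form against rejection sampling using only the Python standard library:

```python
import math
import random

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_mean(mu, sigma):
    """E[w | w >= 0] for w ~ N(mu, sigma^2): a smoothed ReLU of mu."""
    a = mu / sigma
    return mu + sigma * phi(a) / Phi(a)

# Monte Carlo check by rejection sampling: draw from N(mu, sigma^2),
# keep only nonnegative draws, and average.
random.seed(0)
mu, sigma = 0.5, 1.0
samples = []
while len(samples) < 200_000:
    w = random.gauss(mu, sigma)
    if w >= 0.0:
        samples.append(w)
mc = sum(samples) / len(samples)
print(truncated_mean(mu, sigma), mc)
```

For large positive mu the truncated mean approaches mu (the linear regime), while for large negative mu it approaches zero from above, which is the ReLU-like nonlinearity the marginal model inherits from its truncated latent variables.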

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-su16,
  title     = {Nonlinear Statistical Learning with Truncated Gaussian Graphical Models},
  author    = {Su, Qinliang and Liao, Xuejun and Chen, Changyou and Carin, Lawrence},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1948--1957},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/su16.pdf},
  url       = {https://proceedings.mlr.press/v48/su16.html},
  abstract  = {We introduce the truncated Gaussian graphical model (TGGM) as a novel framework for designing statistical models for nonlinear learning. A TGGM is a Gaussian graphical model (GGM) with a subset of variables truncated to be nonnegative. The truncated variables are assumed latent and integrated out to induce a marginal model. We show that the variables in the marginal model are non-Gaussian distributed and their expected relations are nonlinear. We use expectation-maximization to break the inference of the nonlinear model into a sequence of TGGM inference problems, each of which is efficiently solved by using the properties and numerical methods of multivariate Gaussian distributions. We use the TGGM to design models for nonlinear regression and classification, with the performances of these models demonstrated on extensive benchmark datasets and compared to state-of-the-art competing results.}
}
EndNote
%0 Conference Paper
%T Nonlinear Statistical Learning with Truncated Gaussian Graphical Models
%A Qinliang Su
%A Xuejun Liao
%A Changyou Chen
%A Lawrence Carin
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-su16
%I PMLR
%P 1948--1957
%U https://proceedings.mlr.press/v48/su16.html
%V 48
%X We introduce the truncated Gaussian graphical model (TGGM) as a novel framework for designing statistical models for nonlinear learning. A TGGM is a Gaussian graphical model (GGM) with a subset of variables truncated to be nonnegative. The truncated variables are assumed latent and integrated out to induce a marginal model. We show that the variables in the marginal model are non-Gaussian distributed and their expected relations are nonlinear. We use expectation-maximization to break the inference of the nonlinear model into a sequence of TGGM inference problems, each of which is efficiently solved by using the properties and numerical methods of multivariate Gaussian distributions. We use the TGGM to design models for nonlinear regression and classification, with the performances of these models demonstrated on extensive benchmark datasets and compared to state-of-the-art competing results.
RIS
TY  - CPAPER
TI  - Nonlinear Statistical Learning with Truncated Gaussian Graphical Models
AU  - Qinliang Su
AU  - Xuejun Liao
AU  - Changyou Chen
AU  - Lawrence Carin
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-su16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1948
EP  - 1957
L1  - http://proceedings.mlr.press/v48/su16.pdf
UR  - https://proceedings.mlr.press/v48/su16.html
AB  - We introduce the truncated Gaussian graphical model (TGGM) as a novel framework for designing statistical models for nonlinear learning. A TGGM is a Gaussian graphical model (GGM) with a subset of variables truncated to be nonnegative. The truncated variables are assumed latent and integrated out to induce a marginal model. We show that the variables in the marginal model are non-Gaussian distributed and their expected relations are nonlinear. We use expectation-maximization to break the inference of the nonlinear model into a sequence of TGGM inference problems, each of which is efficiently solved by using the properties and numerical methods of multivariate Gaussian distributions. We use the TGGM to design models for nonlinear regression and classification, with the performances of these models demonstrated on extensive benchmark datasets and compared to state-of-the-art competing results.
ER  -
APA
Su, Q., Liao, X., Chen, C. & Carin, L. (2016). Nonlinear Statistical Learning with Truncated Gaussian Graphical Models. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1948-1957. Available from https://proceedings.mlr.press/v48/su16.html.