Provable Tensor Methods for Learning Mixtures of Generalized Linear Models

Hanie Sedghi, Majid Janzamin, Anima Anandkumar
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1223-1231, 2016.

Abstract

We consider the problem of learning mixtures of generalized linear models (GLM) which arise in classification and regression problems. Typical learning approaches such as expectation maximization (EM) or variational Bayes can get stuck in spurious local optima. In contrast, we present a tensor decomposition method which is guaranteed to correctly recover the parameters. The key insight is to employ certain feature transformations of the input, which depend on the input generative model. Specifically, we employ score function tensors of the input and compute their cross-correlation with the response variable. We establish that the decomposition of this tensor consistently recovers the parameters, under mild non-degeneracy conditions. We demonstrate that the computational and sample complexity of our method is a low order polynomial of the input and the latent dimensions.
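The pipeline the abstract describes, forming the cross-correlation of the response with a score function tensor of the input and then decomposing it, can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's full algorithm: it assumes standard Gaussian input (whose third-order score function is the Hermite tensor x⊗x⊗x − sym(x⊗I)), builds a population-level tensor of the form Σⱼ λⱼ βⱼ⊗βⱼ⊗βⱼ directly rather than estimating it from samples, and recovers components with a basic tensor power method with deflation. The function names are ours, not the paper's.

```python
import numpy as np

def score_tensor_3(x):
    """Third-order score function of a standard Gaussian input:
    S3(x) = x⊗x⊗x - sym(x⊗I), i.e. the third Hermite tensor.
    (Assumption: input distribution is standard Gaussian.)"""
    d = x.shape[0]
    I = np.eye(d)
    T = np.einsum('i,j,k->ijk', x, x, x)
    T -= np.einsum('i,jk->ijk', x, I)
    T -= np.einsum('j,ik->ijk', x, I)
    T -= np.einsum('k,ij->ijk', x, I)
    return T

def tensor_power_method(T, k, n_iter=100, n_restarts=10, seed=0):
    """Recover (weights, components) of a symmetric third-order tensor
    sum_j w_j b_j⊗b_j⊗b_j via power iteration with deflation.
    Basic variant; assumes (near-)orthogonal components."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    T = T.copy()
    weights, comps = [], []
    for _ in range(k):
        best_v, best_lam = None, -np.inf
        for _ in range(n_restarts):
            v = rng.standard_normal(d)
            v /= np.linalg.norm(v)
            for _ in range(n_iter):
                # Power update: v <- T(I, v, v) / ||T(I, v, v)||
                v = np.einsum('ijk,j,k->i', T, v, v)
                v /= np.linalg.norm(v)
            lam = np.einsum('ijk,i,j,k->', T, v, v, v)
            if lam > best_lam:
                best_lam, best_v = lam, v
        weights.append(best_lam)
        comps.append(best_v)
        # Deflate the recovered rank-one term before the next component.
        T -= best_lam * np.einsum('i,j,k->ijk', best_v, best_v, best_v)
    return np.array(weights), np.array(comps)

# Population-level demo: under the paper's conditions the cross-moment
# E[y * S3(x)] takes the form sum_j lambda_j beta_j⊗beta_j⊗beta_j; here we
# build such a tensor directly and recover its components.
d = 4
beta = np.eye(d)[:2]          # two orthonormal parameter vectors (toy choice)
lams = np.array([0.6, 0.4])
M3 = sum(l * np.einsum('i,j,k->ijk', b, b, b) for l, b in zip(lams, beta))
weights, comps = tensor_power_method(M3, k=2)
```

In a finite-sample version, `M3` would be replaced by the empirical average of `y_i * score_tensor_3(x_i)`; the paper's guarantees concern exactly when that estimate, after decomposition, consistently recovers the mixture parameters.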

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-sedghi16,
  title     = {Provable Tensor Methods for Learning Mixtures of Generalized Linear Models},
  author    = {Sedghi, Hanie and Janzamin, Majid and Anandkumar, Anima},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1223--1231},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/sedghi16.pdf},
  url       = {http://proceedings.mlr.press/v51/sedghi16.html},
  abstract  = {We consider the problem of learning mixtures of generalized linear models (GLM) which arise in classification and regression problems. Typical learning approaches such as expectation maximization (EM) or variational Bayes can get stuck in spurious local optima. In contrast, we present a tensor decomposition method which is guaranteed to correctly recover the parameters. The key insight is to employ certain feature transformations of the input, which depend on the input generative model. Specifically, we employ score function tensors of the input and compute their cross-correlation with the response variable. We establish that the decomposition of this tensor consistently recovers the parameters, under mild non-degeneracy conditions. We demonstrate that the computational and sample complexity of our method is a low order polynomial of the input and the latent dimensions.}
}
Endnote
%0 Conference Paper
%T Provable Tensor Methods for Learning Mixtures of Generalized Linear Models
%A Hanie Sedghi
%A Majid Janzamin
%A Anima Anandkumar
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-sedghi16
%I PMLR
%P 1223--1231
%U http://proceedings.mlr.press/v51/sedghi16.html
%V 51
%X We consider the problem of learning mixtures of generalized linear models (GLM) which arise in classification and regression problems. Typical learning approaches such as expectation maximization (EM) or variational Bayes can get stuck in spurious local optima. In contrast, we present a tensor decomposition method which is guaranteed to correctly recover the parameters. The key insight is to employ certain feature transformations of the input, which depend on the input generative model. Specifically, we employ score function tensors of the input and compute their cross-correlation with the response variable. We establish that the decomposition of this tensor consistently recovers the parameters, under mild non-degeneracy conditions. We demonstrate that the computational and sample complexity of our method is a low order polynomial of the input and the latent dimensions.
RIS
TY  - CPAPER
TI  - Provable Tensor Methods for Learning Mixtures of Generalized Linear Models
AU  - Hanie Sedghi
AU  - Majid Janzamin
AU  - Anima Anandkumar
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-sedghi16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 1223
EP  - 1231
L1  - http://proceedings.mlr.press/v51/sedghi16.pdf
UR  - http://proceedings.mlr.press/v51/sedghi16.html
AB  - We consider the problem of learning mixtures of generalized linear models (GLM) which arise in classification and regression problems. Typical learning approaches such as expectation maximization (EM) or variational Bayes can get stuck in spurious local optima. In contrast, we present a tensor decomposition method which is guaranteed to correctly recover the parameters. The key insight is to employ certain feature transformations of the input, which depend on the input generative model. Specifically, we employ score function tensors of the input and compute their cross-correlation with the response variable. We establish that the decomposition of this tensor consistently recovers the parameters, under mild non-degeneracy conditions. We demonstrate that the computational and sample complexity of our method is a low order polynomial of the input and the latent dimensions.
ER  -
APA
Sedghi, H., Janzamin, M. & Anandkumar, A. (2016). Provable Tensor Methods for Learning Mixtures of Generalized Linear Models. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1223-1231. Available from http://proceedings.mlr.press/v51/sedghi16.html.