Learning Overcomplete Latent Variable Models through Tensor Methods
Proceedings of The 28th Conference on Learning Theory, PMLR 40:36-112, 2015.
Abstract
We provide guarantees for learning latent variable models, with an emphasis on the overcomplete regime, where the dimensionality of the latent space exceeds the observed dimensionality. In particular, we consider multiview mixtures, ICA, and sparse coding models. Our main tool is a new algorithm for tensor decomposition that works in the overcomplete regime. In the semi-supervised setting, we exploit label information to obtain a rough estimate of the model parameters, and then refine it using the tensor method on unlabeled samples. We establish learning guarantees when the number of components scales as k = o(d^{p/2}), where d is the observed dimension and p is the order of the observed moment employed in the tensor method (typically p = 3, 4). In the unsupervised setting, we propose a simple initialization algorithm based on the SVD of the tensor slices, and provide guarantees under the stricter condition that k ≤ βd (where the constant β can be larger than 1). For the learning applications, we provide tight sample complexity bounds through novel covering arguments.
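To make the two ingredients mentioned above concrete, here is a minimal numerical sketch of SVD-based initialization from a random tensor slice followed by tensor power iterations, applied to a synthetic third-order moment tensor T = Σ_j w_j a_j⊗a_j⊗a_j. This is illustration only, not the paper's algorithm: the function names (svd_slice_init, tensor_power_iteration) and all parameters are hypothetical, and for simplicity the example uses an undercomplete instance (k < d) with nearly orthogonal components, whereas the paper's contribution is precisely to handle the harder overcomplete regime under incoherence conditions.

```python
import numpy as np

# Synthetic example: third-order moment tensor T = sum_j w_j a_j (x) a_j (x) a_j
# built from random unit-norm components (hypothetical toy setup).
rng = np.random.default_rng(0)
d, k = 20, 5
A = rng.standard_normal((d, k))
A /= np.linalg.norm(A, axis=0)                  # unit-norm components a_j
w = np.ones(k)                                  # component weights
T = np.einsum('j,aj,bj,cj->abc', w, A, A, A)

def svd_slice_init(T, rng):
    """Initialize with the top left singular vector of a random
    combination of the tensor's slices, M = T(I, I, theta)."""
    theta = rng.standard_normal(T.shape[2])
    M = np.einsum('abc,c->ab', T, theta)        # random slice combination
    U, _, _ = np.linalg.svd(M)
    return U[:, 0]

def tensor_power_iteration(T, u, n_iters=100):
    """Refine u via the tensor power update u <- T(I, u, u) / ||T(I, u, u)||."""
    for _ in range(n_iters):
        u = np.einsum('abc,b,c->a', T, u, u)    # contract two modes with u
        u /= np.linalg.norm(u)
    return u

u = tensor_power_iteration(T, svd_slice_init(T, rng))
print(np.max(np.abs(A.T @ u)))  # near 1 when u aligns with some component a_j
```

In this near-orthogonal toy case the power update typically converges to one of the a_j; recovering all components in the overcomplete regime requires the additional machinery the paper develops.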