Non-Gaussian Component Analysis via Lattice Basis Reduction

Ilias Diakonikolas, Daniel Kane
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:4535-4547, 2022.

Abstract

Non-Gaussian Component Analysis (NGCA) is the following distribution learning problem: Given i.i.d. samples from a distribution on $\R^d$ that is non-gaussian in a hidden direction $v$ and an independent standard Gaussian in the orthogonal directions, the goal is to approximate the hidden direction $v$. Prior work \citep{DKS17-sq} provided formal evidence for the existence of an information-computation tradeoff for NGCA under appropriate moment-matching conditions on the univariate non-gaussian distribution $A$. The latter result does not apply when the distribution $A$ is discrete. A natural question is whether information-computation tradeoffs persist in this setting. In this paper, we answer this question in the negative by obtaining a sample and computationally efficient algorithm for NGCA in the regime that $A$ is discrete or nearly discrete, in a well-defined technical sense. The key tool leveraged in our algorithm is the LLL method \citep{LLL82} for lattice basis reduction.
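The key tool named above is LLL lattice basis reduction. As a point of reference only (this is not the paper's NGCA algorithm), the following is a minimal pure-Python sketch of textbook LLL with the standard parameter $\delta = 3/4$, using exact rational arithmetic; it is written for clarity, not efficiency.

```python
# Minimal textbook LLL lattice basis reduction (delta = 3/4),
# using exact Fraction arithmetic. Illustrative sketch only --
# not the NGCA algorithm from the paper.
from fractions import Fraction


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def gram_schmidt(basis):
    """Return the Gram-Schmidt orthogonalization and mu coefficients."""
    n = len(basis)
    ortho = []
    mu = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = list(basis[i])
        for j in range(i):
            mu[i][j] = dot(basis[i], ortho[j]) / dot(ortho[j], ortho[j])
            v = [vi - mu[i][j] * oj for vi, oj in zip(v, ortho[j])]
        ortho.append(v)
    return ortho, mu


def lll(basis, delta=Fraction(3, 4)):
    """Reduce an integer lattice basis (list of rows) with textbook LLL."""
    basis = [[Fraction(x) for x in row] for row in basis]
    n = len(basis)
    ortho, mu = gram_schmidt(basis)
    k = 1
    while k < n:
        # Size-reduce b_k against b_{k-1}, ..., b_0.
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])  # nearest integer to the mu coefficient
            if q != 0:
                basis[k] = [bk - q * bj for bk, bj in zip(basis[k], basis[j])]
                ortho, mu = gram_schmidt(basis)
        # Lovasz condition: advance if satisfied, otherwise swap and backtrack.
        if dot(ortho[k], ortho[k]) >= (delta - mu[k][k - 1] ** 2) * dot(
            ortho[k - 1], ortho[k - 1]
        ):
            k += 1
        else:
            basis[k], basis[k - 1] = basis[k - 1], basis[k]
            ortho, mu = gram_schmidt(basis)
            k = max(k - 1, 1)
    return [[int(x) for x in row] for row in basis]
```

The output basis spans the same lattice, and the LLL guarantee bounds the first vector: $\|b_1\| \le 2^{(n-1)/4} (\det L)^{1/n}$. Production code would use an optimized library (e.g. fpylll) rather than this quadratic-recomputation sketch.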

Cite this Paper


BibTeX
@InProceedings{pmlr-v178-diakonikolas22d,
  title = {Non-Gaussian Component Analysis via Lattice Basis Reduction},
  author = {Diakonikolas, Ilias and Kane, Daniel},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages = {4535--4547},
  year = {2022},
  editor = {Loh, Po-Ling and Raginsky, Maxim},
  volume = {178},
  series = {Proceedings of Machine Learning Research},
  month = {02--05 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v178/diakonikolas22d/diakonikolas22d.pdf},
  url = {https://proceedings.mlr.press/v178/diakonikolas22d.html},
  abstract = {Non-Gaussian Component Analysis (NGCA) is the following distribution learning problem: Given i.i.d. samples from a distribution on $\R^d$ that is non-gaussian in a hidden direction $v$ and an independent standard Gaussian in the orthogonal directions, the goal is to approximate the hidden direction $v$. Prior work \citep{DKS17-sq} provided formal evidence for the existence of an information-computation tradeoff for NGCA under appropriate moment-matching conditions on the univariate non-gaussian distribution $A$. The latter result does not apply when the distribution $A$ is discrete. A natural question is whether information-computation tradeoffs persist in this setting. In this paper, we answer this question in the negative by obtaining a sample and computationally efficient algorithm for NGCA in the regime that $A$ is discrete or nearly discrete, in a well-defined technical sense. The key tool leveraged in our algorithm is the LLL method \citep{LLL82} for lattice basis reduction.}
}
Endnote
%0 Conference Paper
%T Non-Gaussian Component Analysis via Lattice Basis Reduction
%A Ilias Diakonikolas
%A Daniel Kane
%B Proceedings of Thirty Fifth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Po-Ling Loh
%E Maxim Raginsky
%F pmlr-v178-diakonikolas22d
%I PMLR
%P 4535--4547
%U https://proceedings.mlr.press/v178/diakonikolas22d.html
%V 178
%X Non-Gaussian Component Analysis (NGCA) is the following distribution learning problem: Given i.i.d. samples from a distribution on $\R^d$ that is non-gaussian in a hidden direction $v$ and an independent standard Gaussian in the orthogonal directions, the goal is to approximate the hidden direction $v$. Prior work \citep{DKS17-sq} provided formal evidence for the existence of an information-computation tradeoff for NGCA under appropriate moment-matching conditions on the univariate non-gaussian distribution $A$. The latter result does not apply when the distribution $A$ is discrete. A natural question is whether information-computation tradeoffs persist in this setting. In this paper, we answer this question in the negative by obtaining a sample and computationally efficient algorithm for NGCA in the regime that $A$ is discrete or nearly discrete, in a well-defined technical sense. The key tool leveraged in our algorithm is the LLL method \citep{LLL82} for lattice basis reduction.
APA
Diakonikolas, I. & Kane, D. (2022). Non-Gaussian Component Analysis via Lattice Basis Reduction. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:4535-4547. Available from https://proceedings.mlr.press/v178/diakonikolas22d.html.