Learning a mixture of two subspaces over finite fields

Aidao Chen, Anindya De, Aravindan Vijayaraghavan
Proceedings of the 32nd International Conference on Algorithmic Learning Theory, PMLR 132:481-504, 2021.

Abstract

We study the problem of learning a mixture of two subspaces over $\mathbb{F}_2^n$. The goal is to recover the individual subspaces, given samples drawn from a (weighted) mixture of the uniform distributions over the two subspaces $A_0$ and $A_1$. This problem is computationally challenging, as it captures the notorious problem of "learning parities with noise" in the degenerate setting when $A_1 \subseteq A_0$. This is in contrast to the analogous problem over the reals, which can be solved in polynomial time (Vidal '03). This leads to the following natural question: is Learning Parities with Noise the only computational barrier to obtaining efficient algorithms for learning mixtures of subspaces over $\mathbb{F}_2^n$? The main result of this paper is an affirmative answer to this question. Namely, we show the following results:

1. When the subspaces $A_0$ and $A_1$ are incomparable, i.e., neither is contained in the other, there is a polynomial-time algorithm to recover the subspaces $A_0$ and $A_1$.

2. When $A_1$ is a subspace of $A_0$ with a significant gap in dimension, i.e., $\dim(A_1) \le \alpha \dim(A_0)$ for some $\alpha < 1$, there is an $n^{O(1/(1-\alpha))}$-time algorithm to recover the subspaces $A_0$ and $A_1$.

Thus, our algorithms imply computational tractability of the problem of learning mixtures of two subspaces, except in the degenerate setting captured by learning parities with noise.
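To make the sampling model concrete, the following minimal Python sketch (not from the paper) generates samples from such a mixture: each sample comes from $A_0$ with probability $w$ and from $A_1$ otherwise, where a uniform element of a subspace is a random $\mathbb{F}_2$-linear combination of its basis vectors. The basis matrices and mixing weight `w` below are illustrative placeholders.

```python
import numpy as np

def sample_uniform_from_subspace(basis, rng):
    """Uniform sample from the F_2-span of the rows of `basis` (shape: dim x n)."""
    coeffs = rng.integers(0, 2, size=basis.shape[0])  # random 0/1 coefficients
    return (coeffs @ basis) % 2                       # linear combination over F_2

def sample_mixture(basis_A0, basis_A1, w, num_samples, seed=0):
    """Draw samples from the weighted mixture: A_0 with prob. w, A_1 with prob. 1 - w."""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(num_samples):
        basis = basis_A0 if rng.random() < w else basis_A1
        samples.append(sample_uniform_from_subspace(basis, rng))
    return np.array(samples)

# Illustrative (placeholder) bases in F_2^4: A_0 = span{e1, e2, e3}, A_1 = span{e3, e4}.
basis_A0 = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]])
basis_A1 = np.array([[0, 0, 1, 0], [0, 0, 0, 1]])
X = sample_mixture(basis_A0, basis_A1, w=0.5, num_samples=10)
print(X)
```

The learning task is the inverse of this process: given only the rows of `X`, recover bases for $A_0$ and $A_1$.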

Cite this Paper


BibTeX
@InProceedings{pmlr-v132-chen21a,
  title     = {Learning a mixture of two subspaces over finite fields},
  author    = {Chen, Aidao and De, Anindya and Vijayaraghavan, Aravindan},
  booktitle = {Proceedings of the 32nd International Conference on Algorithmic Learning Theory},
  pages     = {481--504},
  year      = {2021},
  editor    = {Feldman, Vitaly and Ligett, Katrina and Sabato, Sivan},
  volume    = {132},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--19 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v132/chen21a/chen21a.pdf},
  url       = {https://proceedings.mlr.press/v132/chen21a.html}
}
Endnote
%0 Conference Paper
%T Learning a mixture of two subspaces over finite fields
%A Aidao Chen
%A Anindya De
%A Aravindan Vijayaraghavan
%B Proceedings of the 32nd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Vitaly Feldman
%E Katrina Ligett
%E Sivan Sabato
%F pmlr-v132-chen21a
%I PMLR
%P 481--504
%U https://proceedings.mlr.press/v132/chen21a.html
%V 132
APA
Chen, A., De, A. & Vijayaraghavan, A. (2021). Learning a mixture of two subspaces over finite fields. Proceedings of the 32nd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 132:481-504. Available from https://proceedings.mlr.press/v132/chen21a.html.
