SQ Lower Bounds for Random Sparse Planted Vector Problem

Jingqiu Ding, Yiding Hua
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:558-596, 2023.

Abstract

Consider the setting where a $\rho$-sparse Rademacher vector is planted in a random $d$-dimensional subspace of $\mathbb{R}^n$. A classical question is how to recover this planted vector given a random basis of this subspace. A recent result by Zadik et al. showed that a lattice basis reduction algorithm can recover the planted vector when $n\geq d+1$ (Zadik et al. (2021)). Although the algorithm is not expected to tolerate an inverse-polynomial amount of noise, the result is surprising because it was previously shown that recovery cannot be achieved by low-degree polynomials when $n \ll \rho^2 d^{2}$ (Mao and Wein (2021)). A natural question is whether we can derive a Statistical Query (SQ) lower bound matching the previous low-degree lower bound of Mao and Wein (2021). This would (1) imply that the SQ lower bound can be surpassed by lattice-based algorithms; and (2) predict the computational hardness when the planted vector is perturbed by an inverse-polynomial amount of noise. In this paper, we prove such an SQ lower bound. In particular, we show that a super-polynomial number of VSTAT queries is needed to solve the easier statistical testing problem when $n \ll \rho^2 d^{2}$ and $\rho \gg \frac{1}{\sqrt{d}}$. The most notable technique we use to derive the SQ lower bound is the near-equivalence between SQ lower bounds and low-degree lower bounds (Brennan et al. (2020); Mao and Wein (2021)).
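To make the setting concrete, the following is a minimal NumPy sketch of how an instance of the problem can be sampled: a $\rho$-sparse Rademacher vector, hidden inside a random $d$-dimensional subspace by handing out a mixed basis. The function name, default dimensions, and the unit-magnitude normalization of the Rademacher entries are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def planted_sparse_instance(n=200, d=10, rho=0.1, seed=0):
    """Sketch of the planted sparse vector setting (illustrative only;
    the paper's exact normalization and distribution may differ)."""
    rng = np.random.default_rng(seed)
    # rho-sparse Rademacher vector: each entry is +-1 with prob. rho/2 each, else 0
    support = rng.random(n) < rho
    v = np.where(support, rng.choice([-1.0, 1.0], size=n), 0.0)
    # subspace spanned by v together with d-1 independent Gaussian directions
    B = np.column_stack([v] + [rng.standard_normal(n) for _ in range(d - 1)])
    # hide v by outputting a random (generically invertible) re-mixing of the basis
    Q = rng.standard_normal((d, d))
    return B @ Q, v

basis, v = planted_sparse_instance()  # basis has shape (n, d); v is the hidden vector
```

The recovery task is then: given only `basis` (whose columns span the same subspace), find the sparse direction `v`.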

Cite this Paper


BibTeX
@InProceedings{pmlr-v201-ding23a,
  title     = {SQ Lower Bounds for Random Sparse Planted Vector Problem},
  author    = {Ding, Jingqiu and Hua, Yiding},
  booktitle = {Proceedings of The 34th International Conference on Algorithmic Learning Theory},
  pages     = {558--596},
  year      = {2023},
  editor    = {Agrawal, Shipra and Orabona, Francesco},
  volume    = {201},
  series    = {Proceedings of Machine Learning Research},
  month     = {20 Feb--23 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v201/ding23a/ding23a.pdf},
  url       = {https://proceedings.mlr.press/v201/ding23a.html},
  abstract  = {Consider the setting where a $\rho$-sparse Rademacher vector is planted in a random $d$-dimensional subspace of $\mathbb{R}^n$. A classical question is how to recover this planted vector given a random basis of this subspace. A recent result by Zadik et al. showed that a lattice basis reduction algorithm can recover the planted vector when $n\geq d+1$ (Zadik et al. (2021)). Although the algorithm is not expected to tolerate an inverse-polynomial amount of noise, the result is surprising because it was previously shown that recovery cannot be achieved by low-degree polynomials when $n \ll \rho^2 d^{2}$ (Mao and Wein (2021)). A natural question is whether we can derive a Statistical Query (SQ) lower bound matching the previous low-degree lower bound of Mao and Wein (2021). This would (1) imply that the SQ lower bound can be surpassed by lattice-based algorithms; and (2) predict the computational hardness when the planted vector is perturbed by an inverse-polynomial amount of noise. In this paper, we prove such an SQ lower bound. In particular, we show that a super-polynomial number of VSTAT queries is needed to solve the easier statistical testing problem when $n \ll \rho^2 d^{2}$ and $\rho \gg \frac{1}{\sqrt{d}}$. The most notable technique we use to derive the SQ lower bound is the near-equivalence between SQ lower bounds and low-degree lower bounds (Brennan et al. (2020); Mao and Wein (2021)).}
}
Endnote
%0 Conference Paper
%T SQ Lower Bounds for Random Sparse Planted Vector Problem
%A Jingqiu Ding
%A Yiding Hua
%B Proceedings of The 34th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Shipra Agrawal
%E Francesco Orabona
%F pmlr-v201-ding23a
%I PMLR
%P 558--596
%U https://proceedings.mlr.press/v201/ding23a.html
%V 201
%X Consider the setting where a $\rho$-sparse Rademacher vector is planted in a random $d$-dimensional subspace of $\mathbb{R}^n$. A classical question is how to recover this planted vector given a random basis of this subspace. A recent result by Zadik et al. showed that a lattice basis reduction algorithm can recover the planted vector when $n\geq d+1$ (Zadik et al. (2021)). Although the algorithm is not expected to tolerate an inverse-polynomial amount of noise, the result is surprising because it was previously shown that recovery cannot be achieved by low-degree polynomials when $n \ll \rho^2 d^{2}$ (Mao and Wein (2021)). A natural question is whether we can derive a Statistical Query (SQ) lower bound matching the previous low-degree lower bound of Mao and Wein (2021). This would (1) imply that the SQ lower bound can be surpassed by lattice-based algorithms; and (2) predict the computational hardness when the planted vector is perturbed by an inverse-polynomial amount of noise. In this paper, we prove such an SQ lower bound. In particular, we show that a super-polynomial number of VSTAT queries is needed to solve the easier statistical testing problem when $n \ll \rho^2 d^{2}$ and $\rho \gg \frac{1}{\sqrt{d}}$. The most notable technique we use to derive the SQ lower bound is the near-equivalence between SQ lower bounds and low-degree lower bounds (Brennan et al. (2020); Mao and Wein (2021)).
APA
Ding, J. & Hua, Y. (2023). SQ Lower Bounds for Random Sparse Planted Vector Problem. Proceedings of The 34th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 201:558-596. Available from https://proceedings.mlr.press/v201/ding23a.html.